33 results for "Modelagem de processos" (process modeling)
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Brazil's fragile position in tourism competitiveness is evident in data from the World Tourism Organization: in 2011 the country dropped from 45th to 52nd place, despite leading in the natural-resources attribute and ranking 23rd in cultural resources. Considerable interest and effort have therefore been directed at studying the competitiveness of tourism products and destinations. A tourist destination is characterized by a complex, interconnected set of tangible and intangible factors, with high-dimensional data, nonlinearity, and dynamic behavior, which makes these processes difficult to model with classical statistical techniques. This thesis investigated structural equation models and their algorithms as applied to this area, covering the full data-analysis cycle: a confirmatory process for developing and evaluating a holistic model of tourist satisfaction; validation of the measurement and structural models through multi-group invariance tests; a comparative analysis of the MLE, GLS, and ULS estimation methods for modeling satisfaction; and market segmentation in the tourist-destination sector using Kohonen self-organizing maps, validated with structural equation modeling. The applications involved data analyses in tourism, the main service industry of the State of Rio Grande do Norte, for which structural equation models of behavioral patterns at a tourist destination were theoretically developed and empirically tested.
The empirical results were based on surveys using systematic random sampling, conducted in Natal-RN between January and March 2013. They provided sound evidence that the proposed theoretical model is satisfactory, with high explanatory and predictive power, satisfaction being the most important antecedent of destination loyalty. Furthermore, satisfaction mediates between travel motivation and destination loyalty, and tourists first seek satisfaction with the quality of tourism services and only afterwards with the aspects that influence loyalty. Academic and managerial contributions are presented, and suggestions for future work are given.
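The mediation structure described above (satisfaction mediating between travel motivation and destination loyalty) can be sketched with a toy path model. The data and path coefficients below are simulated purely for illustration, not drawn from the thesis survey:

```python
# Minimal mediation sketch: motivation -> satisfaction -> loyalty.
# All coefficients are invented illustrative values.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
motivation = rng.normal(size=n)
a, b, c_prime = 0.6, 0.8, 0.2          # hypothetical "true" paths
satisfaction = a * motivation + rng.normal(scale=0.5, size=n)
loyalty = b * satisfaction + c_prime * motivation + rng.normal(scale=0.5, size=n)

def ols(X, y):
    """Ordinary least-squares coefficients."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

a_hat = ols(motivation[:, None], satisfaction)[0]
bc = ols(np.column_stack([satisfaction, motivation]), loyalty)
b_hat, c_hat = bc[0], bc[1]
indirect = a_hat * b_hat   # mediated (indirect) effect, approx a*b = 0.48
print(round(indirect, 2), round(c_hat, 2))
```

A full SEM treatment would estimate all paths simultaneously with latent variables; this regression-based decomposition only conveys the idea of an indirect effect.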
Abstract:
Modeling of industrial processes supports production and cost minimization by enabling prediction of future system behavior, process supervision, and controller design. Given these benefits, the first goal of this dissertation is to present a methodology for identifying nonlinear models with NARX structure, based on the implementation of combined algorithms for structure detection and parameter estimation. The text first highlights the importance of system identification for the optimization of industrial processes, specifically the choice of a model that adequately represents the system dynamics. It then gives a brief review of the stages of system identification, followed by the fundamental methods for structure detection (Modified Gram-Schmidt) and parameter estimation (Least Squares and Extended Least Squares). Using the implemented algorithms, two distinct industrial processes are identified: a didactic level plant, which allows level and flow control, and a simulated primary petroleum-processing plant representing the primary treatment of oil carried out on offshore platforms. The dissertation closes with an evaluation of the performance of the obtained models against the real systems, showing whether the identified models can represent the static and dynamic characteristics of the systems studied.
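A NARX model is linear in its parameters once the regressor set is fixed, so parameter estimation reduces to least squares. The sketch below uses an invented system and regressor set for illustration; the structure-detection step (Modified Gram-Schmidt with error-reduction ratios) that the dissertation pairs with estimation is omitted here:

```python
# Least-squares estimation of a NARX model that is linear in the
# parameters. The "true" system and regressors are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N = 400
u = rng.uniform(-1, 1, N)
y = np.zeros(N)
for k in range(2, N):   # simulated plant (noise-free for clarity)
    y[k] = 0.5*y[k-1] - 0.3*y[k-2] + 0.8*u[k-1] + 0.2*y[k-1]*u[k-1]

def regressors(y, u, k):
    """Candidate terms: linear lags plus one bilinear term."""
    return [y[k-1], y[k-2], u[k-1], y[k-1]*u[k-1]]

Phi = np.array([regressors(y, u, k) for k in range(2, N)])
theta = np.linalg.lstsq(Phi, y[2:N], rcond=None)[0]
print(np.round(theta, 3))   # recovers [0.5, -0.3, 0.8, 0.2]
```

With noisy data, the Extended Least Squares variant mentioned in the abstract adds noise-model regressors to avoid biased estimates.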
Abstract:
A type of macro-drainage solution widely used in urban areas with a predominance of closed catchments (basins without an outlet) is the implementation of detention and infiltration reservoirs (DIR). The main function of this type of solution is to store surface runoff and to promote soil infiltration and, consequently, aquifer recharge, in order to avoid floods in the low-lying areas of the drainage basin. Catchment waterproofing reduces distributed groundwater recharge in urban areas, as is the case of the city of Natal, RN. The advantage of a DIR, however, is that it concentrates the runoff and promotes aquifer recharge in amounts that can surpass the distributed natural recharge. In this work we studied a small urban drainage catchment, named the Experimental Mirassol Watershed (EMW) in Natal, RN, whose outlet is a DIR. The rainfall-runoff transformation, water accumulation in the DIR, and infiltration and percolation through the soil profile down to the unconfined aquifer were modeled; from observed rainfall events, water levels in the DIR, measurements of the aquifer water table, and the determination of parameter values, it was possible to calibrate and model these combined processes. The mathematical modeling was carried out with two numerical models: the rainfall-runoff model developed by RIGHETTO (2014), and a one-dimensional model developed here to simulate soil infiltration, percolation, soil-water redistribution, and groundwater flow, coupled to the reservoir water balance. Continuous simulation was run over a period of eighteen months at one-minute time intervals. The drainage basin was discretized into block units and street reaches, and the soil profile into vertical cells 2 cm deep down to a total depth of 30 m.
The generated hydrographs were transformed into inflow volumes to the DIR, and a water balance was then carried out at each time interval, accounting for infiltration and percolation of water through the soil profile. As a result, we were able to evaluate the water storage process in the DIR as well as infiltration, redistribution of water into the soil, and aquifer recharge, in a continuous temporal simulation. We found that the DIR performs well in storing excess drainage water and in contributing to the local aquifer recharge process (Dunas/Barreiras Aquifer).
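The per-time-step reservoir water balance described above can be sketched in a few lines. The rates, areas, and capacity below are invented illustrative values, and the infiltration law is a crude constant-rate stand-in for the soil-profile model used in the thesis:

```python
# Minimal per-step water balance for a detention/infiltration reservoir.
# All parameter values are illustrative, not from the EMW study.
def dir_water_balance(storage, inflow, infil_rate, area, dt, capacity):
    """Advance reservoir storage one step (volumes m^3, rates m/s, dt s)."""
    infiltration = min(storage + inflow * dt, infil_rate * area * dt)
    storage = storage + inflow * dt - infiltration
    overflow = max(0.0, storage - capacity)
    storage -= overflow
    return storage, infiltration, overflow

s = 0.0
for _ in range(60):                     # one hour at 1-minute steps
    s, f, o = dir_water_balance(s, inflow=0.5, infil_rate=1e-5,
                                area=2000.0, dt=60.0, capacity=5000.0)
print(round(s, 1))                      # stored volume after one hour
```

The actual model couples this balance to 1-D vertical infiltration, percolation, and redistribution through 2 cm soil cells; the constant `infil_rate` here only stands in for that process.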
Abstract:
The main hypothesis of this thesis is that the efficient development of industrial automation applications requires a good structuring of the data to be handled. With the aim of structuring the knowledge involved in the context of industrial processes, this thesis proposes an ontology called OntoAuto that conceptually models the elements involved in the description of industrial processes. To validate the proposed ontology, several applications are presented. In the first, two typical industrial processes are modeled conceptually: a DEA (diethanolamine) treatment unit and a kiln. In the second application, the ontology is used to perform semantic filtering of alarms which, together with correlation analysis, provides temporal relationships between the alarms of an industrial process. In the third application, the ontology is used for modeling and analyzing construction and operation costs of processes. In the fourth, the ontology is adopted to analyze the reliability and availability of an industrial plant. For both the cost application and the reliability analysis it was necessary to create new ontologies, OntoEcon and OntoConf respectively, which import the knowledge represented in OntoAuto while adding domain-specific information. The main conclusion of the thesis is that ontology approaches are well suited to structuring the knowledge of industrial processes and that, based on them, various advanced applications in industrial automation can be developed.
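At its core, an ontology like OntoAuto organizes concepts in an is-a hierarchy with typed relations, which downstream applications (such as semantic alarm filtering) can query. The concept names below are invented for illustration and are not taken from OntoAuto itself:

```python
# Hedged sketch of an is-a concept hierarchy for industrial processes.
# Concept names are illustrative, not the actual OntoAuto vocabulary.
is_a = {
    "KilnProcess": "IndustrialProcess",
    "DEATreatmentUnit": "IndustrialProcess",
    "HighTemperatureAlarm": "Alarm",
    "Alarm": "ProcessEvent",
}

def ancestors(concept):
    """Walk the is-a hierarchy from a concept up to the root."""
    chain = []
    while concept in is_a:
        concept = is_a[concept]
        chain.append(concept)
    return chain

# A semantic filter could, e.g., keep only events that are Alarms:
print(ancestors("HighTemperatureAlarm"))
```

Real ontologies are typically expressed in OWL/RDF with a reasoner on top; this dictionary is only the minimal idea of subsumption queries.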
Abstract:
Deep bed filtration occurs in several industrial and environmental processes, such as water filtration and soil contamination. In the petroleum industry, deep bed filtration occurs near injection wells during water injection, causing injectivity decline. It also takes place during well drilling, sand production control, disposal of produced water in aquifers, etc. Particle capture in porous media can be caused by different physical mechanisms (size exclusion, electrical forces, bridging, gravity, etc.). A statistical model for filtration in porous media is proposed, and analytical solutions for the suspended and retained particle concentrations are derived. The model, which incorporates a particle retention probability, is compared with the classical deep bed filtration model, allowing a physical interpretation of the filtration coefficients. Comparison of the analytical solutions of the proposed model with the classical solutions shows that the larger the particle capture probability, the larger the discrepancy between the proposed and classical models.
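For reference, in the classical deep bed filtration model with a constant filtration coefficient lambda, the steady-state suspended concentration decays exponentially with depth, c(x) = c0 exp(-lam x). The numbers below are illustrative:

```python
# Classical deep-bed filtration profile (constant filtration coefficient).
# c0 and lam are illustrative values.
import math

def suspended_concentration(c0, lam, x):
    """Steady-state suspended concentration at depth x (classical model)."""
    return c0 * math.exp(-lam * x)

c0, lam = 1.0, 2.0          # inlet concentration, filtration coeff (1/m)
print(round(suspended_concentration(c0, lam, 1.0), 4))
```

The proposed statistical model deviates from this exponential profile, and the abstract notes that the deviation grows with the particle capture probability.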
Abstract:
In the last decades, the oil, gas, and petrochemical industries have registered a series of major accidents. Influenced by this context, companies have felt the need to engage in processes to protect the external environment, which can be understood as an ecological concern. In the particular case of the nuclear industry, sustainable education and training, which depend heavily on the quality and applicability of the knowledge base, have been considered key points in the safe application of this energy source. Consequently, this research was motivated by the use of the ontology concept as a tool to improve knowledge management in a refinery, through the representation of a fuel-gas sweetening plant, combining many pieces of information associated with its normal operation mode. In terms of methodology, this research can be classified as applied and descriptive: many pieces of information were analysed, classified, and interpreted to create the ontology of a real plant. The DEA plant was modeled according to its process flow diagram, piping and instrumentation diagrams, descriptive documents of its normal operation mode, and the list of all alarms associated with the instruments, complemented by an unstructured interview with a specialist in the plant's operation. The ontology was verified by comparing its descriptive diagrams with the original plant documents and through discussion with other members of the research group. All the concepts applied in this research can be expanded to represent other plants in the same refinery or even in other kinds of industry. An ontology can be considered a knowledge base that, because of its formal representation, can serve as one of the elements in developing tools to navigate through the plant, simulate its behavior, and diagnose faults, among other possibilities.
Abstract:
In this dissertation, the theoretical principles of molecular modeling were applied to the electronic characterization of the oligopeptide α3 and its variants (5Q, 7Q)-α3, as well as to the quantum description of the interaction between the aminoglycoside hygromycin B and the 30S subunit of the bacterial ribosome. In the first study, the linear, neutral dipeptides that make up the mentioned oligopeptides were modeled and then optimized toward structures of lower potential energy and appropriate dihedral angles. Three successive geometry-optimization stages, based on classical Newtonian mechanics, semi-empirical methods, and density functional theory (DFT), explored the energy landscape of each dipeptide in the search for minimum-energy structures. Finally, the optimal conformers were characterized in terms of their electrostatic potential, ionization energy (amino acids), frontier molecular orbitals, and hopping terms. From the hopping terms described in this study, it was possible in subsequent studies to characterize the charge-transport properties of these peptide models. The envisioned application is a new biosensor technology capable of diagnosing amyloid diseases, related to the accumulation of misfolded proteins, based on the conductivity displayed by a patient's proteins. In the second part of this dissertation, a quantum molecular modeling study of the interaction energy between a ribosomal aminoglycoside antibiotic and its receptor was carried out. Hygromycin B (hygB) is an aminoglycoside antibiotic that affects ribosomal translocation by direct interaction with the small subunit of the bacterial ribosome (30S), specifically with nucleotides in helix 44 of the 16S ribosomal RNA (16S rRNA).
Due to the strongly electrostatic character of this binding, an energetic investigation of the binding mechanism of this complex was proposed using different values of the dielectric constant (ε = 0, 4, 10, 20, and 40), values widely used to study the electrostatic properties of biomolecules. For this, spheres of increasing radius centered on the hygB centroid were taken from the 30S-hygB crystal structure (1HNZ.pdb), and the individual interaction energy of each enclosed nucleotide was determined by quantum calculations using the molecular fractionation with conjugate caps (MFCC) strategy. It was observed that larger dielectric constants reduce the magnitude of the individual interaction energies, allowing the convergence state to be reached quickly. Only for ε = 40, however, does the total drug-receptor binding energy stabilize at r = 18 Å, which defines an appropriate binding pocket because it encompasses the main residues that interact most strongly with hygB: C1403, C1404, G1405, A1493, G1494, U1495, U1498, and C1496. Thus, a dielectric constant of ≈ 40 is suitable for treating systems with many electrical charges. By comparing the individual binding energies of the 16S rRNA nucleotides with experimental assays that determine the minimum inhibitory concentration (MIC) of hygB, it is believed that residues with high binding energies generate bacterial resistance to the drug when mutated. By the same reasoning, since residues with low interaction energies do not effectively influence the affinity of hygB for its binding site, there would be no loss of effectiveness if they were replaced.
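The role of the dielectric constant in damping pairwise electrostatic energies can be illustrated with a bare screened-Coulomb term. The charges, distance, and unit constant below are generic textbook values, not the MFCC results of this study:

```python
# Dielectric screening of a pairwise Coulomb interaction (illustrative).
K = 332.06  # Coulomb constant in kcal*Angstrom/(mol*e^2)

def screened_energy(q1, q2, r, eps):
    """Coulomb interaction energy between charges q1, q2 (in e) at
    distance r (Angstrom), screened by dielectric constant eps."""
    return K * q1 * q2 / (eps * r)

# A hypothetical drug-residue charge pair at 5 Angstrom:
for eps in (4, 10, 20, 40):
    print(eps, round(screened_energy(-1.0, +1.0, 5.0, eps), 2))
```

Larger eps values scale every pair energy down by the same factor, which is why the per-residue sums converge faster at high dielectric constants, as the abstract reports.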
Abstract:
A numerical study of the behavior of tied-back retaining walls in sand, using the finite element method (FEM), is presented. The analyses were performed with the software Plaxis 2D and focused on the development of horizontal displacements, horizontal stresses, shear forces, and bending moments in the structure during the construction process. Emphasis was placed on evaluating the effects of wall embedment, tie-back horizontal spacing, wall thickness, and free anchor length on wall behavior. A soil profile representative of a specific region of the city of Natal, Brazil, was used in the numerical analyses; new facilities built in this region often include retaining structures of the type studied herein. Soil behavior was modeled with the Mohr-Coulomb constitutive model, whereas the structural elements were modeled as linear elastic. Shear strength parameters of the soil layers were obtained from direct shear tests conducted on samples collected at the studied site. Deformation parameters were obtained from empirical correlations with SPT results from the same site. The numerical analyses revealed that the effect of wall embedment on the investigated parameters is virtually negligible. Conversely, the tie-back horizontal spacing plays an important role. The results also demonstrated that wall thickness significantly affects the horizontal displacements, shear forces, and bending moments in the retaining structure; however, wall thickness was not found to influence the horizontal stresses in the structure.
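As a classical hand-check that complements FEM analyses of this kind, the Mohr-Coulomb friction angle gives the Rankine active earth pressure coefficient and the resultant thrust on a wall. The unit weight, height, and friction angle below are illustrative, not the calibrated values of the study:

```python
# Rankine active earth pressure: a classical sanity check for
# retaining-wall analyses. Input values are illustrative.
import math

def rankine_ka(phi_deg):
    """Active earth pressure coefficient for friction angle phi (deg)."""
    return math.tan(math.radians(45.0 - phi_deg / 2.0)) ** 2

def active_thrust(gamma, H, phi_deg):
    """Resultant active thrust (kN per m of wall) for unit weight gamma
    (kN/m^3) and wall height H (m), cohesionless soil, no surcharge."""
    return 0.5 * rankine_ka(phi_deg) * gamma * H ** 2

print(round(rankine_ka(30.0), 3))               # 1/3 for phi = 30 deg
print(round(active_thrust(18.0, 6.0, 30.0), 1)) # kN/m
```

FEM captures construction stages, anchor prestress, and soil-structure interaction that this closed-form estimate ignores, which is precisely why the numerical study was needed.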
Abstract:
The main goal of this dissertation is to develop a multi-criteria decision aid model for the selection of oil and gas drilling rig contracts. The model should allow the use of multiple criteria, addressing the problems of models that rely mainly on contract price as the decision criterion. AHP was chosen because of its wide use, academic and otherwise, its simplicity and flexibility, and because it meets all the requirements of the task. The model was developed through interviews and surveys with a specialist in this area, who also acts as the main actor in the decision process. The final model consists of six criteria: cost, mobility, automation, technical support, how quickly the service can be concluded, and availability to start operations. Three rigs were chosen as possible solutions to the problem. The results obtained with the model suggest that using AHP as a decision support method in this kind of situation is feasible: it simplifies the problem and is a useful tool for improving the knowledge of everyone involved in the process about the problem and its possible solutions.
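The core AHP computation is a pairwise comparison matrix reduced to a priority vector, with a consistency check. The matrix below compares three hypothetical criteria with invented judgments; it is not the specialist's actual comparison data:

```python
# AHP priority vector and consistency ratio for a 3x3 comparison matrix.
# The judgments are invented for illustration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g. cost vs mobility vs support
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority vector via the geometric-mean approximation of the
# principal eigenvector:
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w = w / w.sum()

# Consistency ratio (Saaty's random index RI = 0.58 for n = 3):
n = A.shape[0]
lam_max = ((A @ w) / w).mean()
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58
print(np.round(w, 3), round(CR, 3))   # CR < 0.1 means acceptable consistency
```

In a full application, alternatives (the three rigs) are also compared pairwise under each criterion, and the criterion weights aggregate those scores into a final ranking.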
Abstract:
In industrial informatics, several attempts have been made to develop notations and semantics for classifying and describing different kinds of system behavior, particularly in the modeling phase. Such attempts provide the infrastructure to solve real engineering problems and to build practical systems aimed mainly at increasing the productivity, quality, and safety of the process. Despite the many studies that have attempted to develop friendly methods for programming industrial controllers, they are still programmed by conventional trial-and-error methods and, in practice, there is little written documentation on these systems. The ideal solution would be a computational environment that allows industrial engineers to implement the system in a high-level language that follows international standards. Accordingly, this work proposes a methodology for plant and control modeling of discrete event systems involving sequential, parallel, and timed operations, using a formalism based on Statecharts called Basic Statechart (BSC). The methodology also provides automatic procedures to validate and implement these systems. To validate the methodology, two case studies with typical examples from the manufacturing sector are presented. The first shows sequential control of a tagged machine, used to illustrate dependencies between the devices of the plant. The second discusses more than one strategy for controlling a manufacturing cell. The uncontrolled model has 72 states (distinct configurations); the model with sequential control generates 20 different states but acts in only 8 distinct configurations, while the model with parallel control generates 210 different states acting in only 26 distinct configurations, making it a less restrictive control strategy than the previous one.
Lastly, an example is presented to highlight the modular character of the methodology, which is very important for the maintenance of applications. In this example, the sensors that identify pieces in the plant are removed, so changes in the control model are needed to transmit the information from the input-buffer sensor to the other positions of the cell.
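The sequential-control idea above can be sketched as a flat state-transition map, a drastic simplification of the Basic Statechart formalism (which adds hierarchy, parallelism, and timing). The states and events below are invented for illustration:

```python
# Minimal finite-state sketch of sequential control of a machine.
# States and events are illustrative, not the BSC case-study model.
TRANSITIONS = {
    ("idle", "piece_arrived"): "loading",
    ("loading", "piece_clamped"): "processing",
    ("processing", "cycle_done"): "idle",
}

def step(state, event):
    """Fire the enabled transition, if any; otherwise stay in place."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for ev in ["piece_arrived", "piece_clamped", "cycle_done"]:
    state = step(state, ev)
print(state)   # back to "idle" after a full cycle
```

Parallel control in the BSC sense would compose several such machines running concurrently, which is where the 210-state product configuration in the abstract comes from.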
Abstract:
The control, automation, and optimization areas help to improve industrial processes, contributing to faster production lines, better product quality, and lower manufacturing costs. Didactic plants are good research tools in these areas, providing direct contact with industrial equipment. Given these capabilities, the main goal of this work is to model and control a didactic plant: a level and flow process control system with industrial instrumentation. With a model it is possible to build a simulator for the plant, allowing studies of its behaviour without the operational costs of the real process; controllers, for example, can be tested several times before being applied to the real process. Among the several types of controllers, adaptive controllers were used, mainly the Direct Self-Tuning Regulator (DSTR) with integral action and Gain Scheduling (GS). The DSTR was based on pole-placement design and uses recursive least squares to calculate the controller parameters. The characteristics of an adaptive system proved valuable in guaranteeing good performance when the controller was applied to the plant.
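The recursive least squares (RLS) estimator at the heart of a self-tuning regulator updates the plant-parameter estimate at every sample. The plant below is an invented first-order discrete model, not the didactic plant's identified dynamics:

```python
# Recursive least squares identifying y[k] = a*y[k-1] + b*u[k-1].
# The "true" parameters are illustrative.
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One RLS step; lam is the forgetting factor (1.0 = none)."""
    K = P @ phi / (lam + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = (P - np.outer(K, phi) @ P) / lam
    return theta, P

rng = np.random.default_rng(2)
true_theta = np.array([0.7, 0.3])
theta = np.zeros(2)
P = np.eye(2) * 100.0          # large P = low confidence in the prior
y_prev, u_prev = 0.0, 0.0
for _ in range(200):
    u = rng.uniform(-1, 1)
    y = true_theta[0] * y_prev + true_theta[1] * u_prev
    phi = np.array([y_prev, u_prev])
    theta, P = rls_update(theta, P, phi, y)
    y_prev, u_prev = y, u
print(np.round(theta, 3))      # converges toward [0.7, 0.3]
```

In a DSTR, each updated estimate feeds a pole-placement computation that re-tunes the controller online; a forgetting factor below 1 lets the estimator track slowly varying plants.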
Abstract:
Operating industrial processes becomes more complex every day, and one of the factors contributing to this growth in complexity is the integration of new technologies and smart solutions in industry, such as decision support systems. In this regard, this dissertation develops a decision support system based on a computational tool known as an expert system. The main goal is to make operation more reliable and secure while maximizing the amount of information relevant to each situation, using a rule-based expert system designed for a particular area of expertise. For modeling such rules, a high-level environment is proposed that allows rules to be created and manipulated more easily through visual programming. Despite its wide range of possible applications, this dissertation focuses on the real-time filtering of alarms during operation, validated in a case study based on a real scenario that occurred in an industrial plant of an oil and gas refinery.
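A rule for alarm filtering can be as simple as a predicate over alarm attributes. The tags, rule, and alarm stream below are invented for illustration; the dissertation's rules are authored graphically in its high-level environment:

```python
# Hedged sketch of rule-based alarm filtering. Tags and the rule
# are hypothetical, not from the refinery case study.
alarms = [
    {"tag": "PI-101", "type": "high_pressure", "unit": "U-10"},
    {"tag": "TI-205", "type": "high_temp", "unit": "U-20"},
    {"tag": "PI-102", "type": "high_pressure", "unit": "U-10"},
]

def filter_alarms(alarms):
    """Rule: suppress repeated alarms of the same type from the same
    unit, keeping only the first occurrence of each (type, unit) pair."""
    seen, kept = set(), []
    for a in alarms:
        key = (a["type"], a["unit"])
        if key in seen:
            continue
        seen.add(key)
        kept.append(a["tag"])
    return kept

print(filter_alarms(alarms))   # the duplicate PI-102 is suppressed
```

A production expert system chains many such rules through an inference engine rather than a single hard-coded filter, but the intent (reducing an alarm flood to the relevant subset) is the same.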
Abstract:
There is a growing need for new tools to help end users in tasks related to the design, monitoring, maintenance, and commissioning of critical infrastructures. The complexity of the industrial environment, for example, requires that these tools have flexible features in order to provide valuable data to designers during the design phases. Furthermore, industrial processes have stringent dependability requirements, since failures can cause economic losses, environmental damage, and danger to people. The lack of tools for evaluating faults in critical infrastructures aggravates these problems. Accordingly, this work presents the development of a framework for dependability analysis of critical infrastructures. The proposal allows the modeling of a critical infrastructure, mapping its components to a fault tree. The generated mathematical model is then used for dependability analysis of the infrastructure, based on the failures of its equipment and their interconnections. Finally, typical scenarios of industrial environments are used to validate the proposal.
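Once components are mapped to a fault tree, the top-event probability follows from the gate logic. The sketch below evaluates AND/OR gates assuming independent basic events; the component probabilities and the tree shape are invented for illustration:

```python
# Minimal fault-tree evaluation for independent basic events.
# Failure probabilities and tree structure are illustrative.
def and_gate(*p):
    """Gate fails only if ALL inputs fail: product of probabilities."""
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    """Gate fails if ANY input fails: 1 - product of survivals."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

pump_a, pump_b, valve = 0.01, 0.01, 0.005
# Top event: both redundant pumps fail, OR the shared valve fails.
top = or_gate(and_gate(pump_a, pump_b), valve)
print(top)
```

Here redundancy makes the pump pair contribute only 1e-4, so the shared valve dominates the top-event probability; spotting such dominant cut sets is a typical output of dependability analysis.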
Abstract:
This research investigated how new technologies can help the furniture design and manufacturing process in small manufacturers in the state of Rio Grande do Norte. Google SketchUp, a 3D modeling tool, is built so that its internal structures are open and can be accessed through SketchUp's Ruby API and programs written in Ruby (plugins). Using the concepts of Group Technology and the flexibility of adding new functionality to this software, a Methodology for the Modeling of Furniture, a Coding System, and a plugin for Google's tool implementing the methodology were created. As a result, the following facilities are available: the user may create and reuse the library's models repeatedly; reports of material and manufacturing-process costs are provided; and detailed drawings are generated, giving better integration between furniture design and the manufacturing process.
Abstract:
Nonionic surfactants are substances whose molecules do not ionize in solution. The solubility of these surfactants in water is due to the presence of functional groups with a strong affinity for water. When these surfactants are heated, two liquid phases form, evidenced by the phenomenon of turbidity (the cloud point). This study aimed to determine experimental cloud-point temperatures for polyethoxylated nonylphenol surfactants and then to perform thermodynamic modeling using the Flory-Huggins model and an empirical solid-liquid equilibrium (SLE) model. The cloud point was determined by the visual method (Inoue et al., 2008). The experimental methodology consisted of preparing synthetic solutions of 0.25%, 0.5%, 1%, 2%, 3%, 4%, 5%, 6%, 7%, 8%, 9%, 10%, 12.5%, 15%, 17%, and 20% surfactant by weight. The nonionic surfactants were selected according to their degree of ethoxylation (9.5, 10, 11, 12, and 13). During the experiments, the solutions were homogenized and the bath temperature was gradually increased while the turbidity of the solution was checked visually (Inoue et al., 2003). These cloud-point temperature data were used to feed the evaluated models and to obtain thermodynamic parameters for the polyethoxylated nonylphenol surfactant systems. The models can then be used in phase-separation processes, facilitating the extraction of organic solvents, and thus serve as quantitative and qualitative parameters. The solid-liquid equilibrium (SLE) model best represented the experimental data.
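The Flory-Huggins model cited above describes demixing through the free energy of mixing per lattice site. The sketch below evaluates the standard expression; the chain length and interaction-parameter values are generic illustrations, not the parameters fitted in the study:

```python
# Flory-Huggins free energy of mixing per lattice site, in units of RT.
# N (chain length) and chi values below are illustrative.
import math

def flory_huggins_dg(phi, chi, N=1):
    """phi: solute volume fraction (0 < phi < 1), N: chain length,
    chi: Flory interaction parameter."""
    return ((phi / N) * math.log(phi)
            + (1.0 - phi) * math.log(1.0 - phi)
            + chi * phi * (1.0 - phi))

# For N = 1 the critical interaction parameter is chi_c = 2:
print(round(flory_huggins_dg(0.5, 1.0), 4))   # negative: mixing favoured
print(round(flory_huggins_dg(0.5, 3.0), 4))   # positive: demixing favoured
```

Since chi typically grows with temperature for these ethoxylated systems, heating drives the mixture past the demixing threshold, which is the thermodynamic picture behind the observed cloud point.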