882 results for Fluid-dynamic analysis


Relevance:

90.00%

Publisher:

Abstract:

In the present paper, the endogenous theory of time preference is extended to analyze those processes of capital accumulation and changes in environmental quality that are dynamically optimum with respect to the intertemporal preference ordering of the representative individual of the society in question. The analysis is carried out within the conceptual framework of the dynamic analysis of environmental quality, as developed by a number of economists for the specific cases of the fisheries and forestry commons. The duality principles for intertemporal preference ordering and capital accumulation are extended to the situation where processes of capital accumulation are subject to the Penrose effect, which exhibits a marginal decrease in the effect of investment in private and social overhead capital upon the rate at which capital is accumulated. The dynamically optimum time-path of economic activities is characterized by the proportionality of two systems of imputed, or efficient, prices: one associated with the given intertemporal ordering and another associated with the processes of accumulation of private and social overhead capital. In particular, it is shown that the dynamic optimality of processes of capital accumulation involving both private and social overhead capital is characterized by conditions identical with those involving private capital, with the role of social overhead capital exhibited only indirectly.
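As a purely illustrative sketch of the kind of dynamic optimization problem the abstract refers to, one may write, in generic notation assumed here rather than taken from the paper (utility u over consumption c and environmental quality V, private capital K, investment per unit of capital z, a Penrose-type installation function φ with φ' > 0 and φ'' < 0, and discount rate δ):

```latex
\max_{c(t),\,z(t)} \int_{0}^{\infty} u\bigl(c(t),\,V(t)\bigr)\, e^{-\delta t}\, dt
\quad \text{subject to} \quad
\dot{K} = \varphi(z)\,K, \qquad \dot{V} = \mu(V) - \gamma\, c .
```

In such a formulation, the imputed prices of K and V appear as the costate variables of the associated Hamiltonian, and the proportionality condition stated in the abstract relates them to the marginal valuations implied by the intertemporal preference ordering.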

Relevance:

90.00%

Publisher:

Abstract:

Tubular polymerization reactors can exhibit a strongly distorted velocity profile. Starting from this observation, a stochastic model based on the axial dispersion model was proposed for the mathematical representation of the fluid dynamics of a tubular reactor for polystyrene production. The differential equation was obtained by introducing randomness into the dispersion parameter, which adds a stochastic term to the model capable of reproducing the oscillations observed experimentally. The stochastic differential equation was discretized and solved satisfactorily by the Euler-Maruyama method. An estimator function was developed to obtain the parameter of the stochastic term, and the parameter of the deterministic term was calculated by the least-squares method. A convergence analysis was carried out to determine the number of discretization elements, and the model was validated by comparing trajectories and computational confidence intervals with experimental data. The results were satisfactory, which helps in understanding the complex fluid-dynamic behaviour of the reactor studied.
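As an illustration of the Euler-Maruyama scheme mentioned above, here is a minimal, self-contained sketch for a generic scalar SDE dX = a(X) dt + b(X) dW; the drift, diffusion and parameter values are placeholders, not the reactor model of the abstract.

```python
import numpy as np

def euler_maruyama(a, b, x0, t_end, n_steps, seed=0):
    """Integrate dX = a(X) dt + b(X) dW with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))              # Wiener increment ~ N(0, dt)
        x[k + 1] = x[k] + a(x[k]) * dt + b(x[k]) * dw
    return np.linspace(0.0, t_end, n_steps + 1), x

# Placeholder drift/diffusion: a mean-reverting deterministic term plus a constant-amplitude noise term.
t, x = euler_maruyama(a=lambda x: -0.5 * (x - 1.0),
                      b=lambda x: 0.1,
                      x0=0.0, t_end=10.0, n_steps=1000)
```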

Relevance:

90.00%

Publisher:

Abstract:

Called Symposium on Shock and Vibration in vols. 2-5.

Relevance:

90.00%

Publisher:

Abstract:

This thesis describes the design and implementation of a new dynamic simulator called DASP. It is a computer program package written in standard Fortran 77 for the dynamic analysis and simulation of chemical plants. Its main uses include the investigation of a plant's response to disturbances, the determination of the optimal ranges and sensitivities of controller settings, and the simulation of the startup and shutdown of chemical plants. The design and structure of the program and a number of features incorporated into it combine to make DASP an effective tool for dynamic simulation. It is an equation-oriented dynamic simulator, but the model equations describing the user's problem are generated from an in-built model equation library. A combination of the structuring of the model subroutines, the concept of a unit module, and the use of the connection matrix of the problem given by the user has been exploited to achieve this objective. The Executive program has a structure similar to that of a CSSL-type simulator. DASP solves a system of differential equations coupled to nonlinear algebraic equations using an advanced mixed equation solver. The strategy used in formulating the model equations makes it possible to obtain the steady-state solution of the problem using the same model equations. DASP can handle state and time events in an efficient way, including modification of the flowsheet. DASP is highly portable, and this has been demonstrated by running it on a number of computers with only trivial modifications. The program runs on a microcomputer with 640 kByte of memory. It is a semi-interactive program, with the bulk of the input data given in pre-prepared data files and communication with the user via an interactive terminal. Using the features built into the package, the user can view or modify the values of any input data, variables and parameters in the model, and modify the structure of the flowsheet of the problem during a simulation session. The program has been demonstrated and verified using a number of example problems.
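The mixed differential/algebraic solve described above can be illustrated by a small backward-Euler sketch (not DASP's actual Fortran algorithm): at each time step the differential states and the algebraic unknowns are solved together as one nonlinear system. The toy equations and step size below are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy DAE:  dx/dt = -x + y   (differential),   0 = y - x**2   (algebraic)
def residual(z_new, x_old, dt):
    x, y = z_new
    return [x - x_old - dt * (-x + y),   # backward-Euler residual for the ODE
            y - x**2]                    # algebraic constraint

def integrate(x0, dt=0.01, n_steps=500):
    x, y = x0, x0**2                     # consistent initial condition
    traj = [(0.0, x, y)]
    for k in range(1, n_steps + 1):
        x, y = fsolve(residual, [x, y], args=(x, dt))   # solve both unknowns simultaneously
        traj.append((k * dt, x, y))
    return np.array(traj)

traj = integrate(x0=1.0)
```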

Relevance:

90.00%

Publisher:

Abstract:

This study presents a computational fluid dynamic (CFD) study of Dimethyl Ether (DME) gas adsorptive separation and steam reforming (DME-SR) in a large scale Circulating Fluidized Bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed flow approach and solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high-potential applications, including the transportation sector and power generation. Computational fluid dynamic (CFD) modelling has attracted considerable recognition in the engineering sector, leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to study in lab scale experiments. CFD provides a cost effective methodology to gain detailed information down to the microscopic level. The main objectives of this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement suitable adsorption models in the CFD code, through a user defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; and (iv) carry out a detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial case problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system. The adsorption process was simulated based on kinetics derived from the experimental data produced as part of a separate PhD project completed under the same fund. The kinetic model was incorporated in the FLUENT CFD tool as a pseudo-first order rate equation; some of the parameters for the pseudo-first order kinetics were obtained using MATLAB. The modelling of DME adsorption in the designed bubbling bed was performed for the first time in this project and highlights the novelty of the investigation. The simulation results were analysed to provide understanding of the flow hydrodynamics, reactor design and optimum operating conditions for efficient separation. The bubbling bed was validated by comparing the predicted bed expansion and solid and gas distributions against trends reported in the literature, with good agreement. Parametric analysis of the adsorption process demonstrated that increasing the fluidizing velocity reduced the adsorption of DME. This is a result of the reduction in gas residence time, which appears to have a greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using selected kinetics from the literature, implemented in the model through an in-house developed user defined function. The kinetics were validated by simulating a case replicating an experimental study of a laboratory scale bubbling bed by Vicente et al [1].
Good agreement was achieved in the validation of the models, which were then applied to the DME-SR in the large scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial scale fluidized bed system. The simulation results provided insight into the flow hydrodynamics, reactor design and optimum operating conditions. The solid and gas distributions in the CFB showed good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam to DME molar ratio increased the production of hydrogen due to the increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing the temperature from 200 °C to 350 °C increased the DME conversion from 47% to 99%, while the hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% due to the water gas shift reaction favouring CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, both of which decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam to DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield lies in the region of 300 °C and a steam/DME molar ratio of 5. The analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed in the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas. The developed models and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR at industrial level; in particular, this work has been of great importance for the industrial collaborator in drawing conclusions and planning for potential future implementation of the process at an industrial scale.
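As a rough illustration of the pseudo-first-order adsorption kinetics mentioned above, a minimal sketch follows; the rate expression is the generic textbook form, and the equilibrium loading and rate constant are hypothetical placeholders, not the parameters implemented in the FLUENT user defined function.

```python
import numpy as np

def pseudo_first_order_uptake(q_eq, k1, t):
    """Pseudo-first-order uptake: dq/dt = k1*(q_eq - q)  ->  q(t) = q_eq*(1 - exp(-k1*t))."""
    return q_eq * (1.0 - np.exp(-k1 * t))

# Hypothetical parameters: equilibrium loading q_eq (mol/kg) and rate constant k1 (1/s).
t = np.linspace(0.0, 300.0, 61)
q = pseudo_first_order_uptake(q_eq=0.8, k1=0.02, t=t)
removal_fraction = q / 0.8          # approaches 1 as the adsorbent equilibrates
```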

Relevance:

90.00%

Publisher:

Abstract:

Computational Fluid Dynamics (CFD) has found great acceptance among the engineering community as a tool for research and design of processes that are practically difficult or expensive to study experimentally. One of these processes is biomass gasification in a Circulating Fluidized Bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at a high temperature and a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are fourfold: (i) develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced Computational Fluid Dynamics (CFD); (ii) experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; (iv) study the CFB gasifier performance through parametric analysis and identify the optimum operating conditions to maximize the product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using PEPT, which was originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and only available at a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model. The outcomes of the primary validations were then applied to the secondary validations of the binary mixture to avoid time consuming computations. Studies on binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both. Following the general gasification steps, the developed model has been separated into three main gasification stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as a mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: the devolatilization of the biomass, treated as a single reaction generating the biomass gases from the volatile material, and tar cracking, which is also modelled as a single reaction generating gases with fixed mass fractions. The first reaction was classified as a heterogeneous reaction, while the second was classified as a homogeneous reaction. The partial combustion and gasification model consists of carbon combustion reactions and carbon and gas phase reactions. The partial combustion considered was for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the reaction with H2O, and the methanation (methane-forming) reaction to generate methane.
The other gas phase reactions considered in this study are the water gas shift reaction, which is modelled as a reversible reaction, and the methane steam reforming reaction. The developed gasification model was validated using different experimental data from the literature over a wide range of operating conditions. Good agreement was observed, confirming the capability of the model in predicting biomass gasification in a CFB with good accuracy. The developed model has been successfully used to carry out sensitivity and parametric analyses. The sensitivity analysis included the effect of including the various combustion reactions and the effect of radiation on the gasification reactions. The developed model was also used to carry out a parametric analysis by changing the following gasifier operating conditions: fuel/air ratio; biomass flow rates; sand (heat carrier) temperatures; sand flow rates; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis models used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
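As a generic illustration of the kind of homogeneous kinetics described above (not the specific rate parameters used in the thesis), a sketch of an Arrhenius-based, reversible water-gas shift rate; the pre-exponential factor and activation energy are placeholders, while the equilibrium-constant correlation is a commonly used approximation.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant k = A*exp(-Ea/(R*T))."""
    return A * np.exp(-Ea / (R * T))

def wgs_rate(c_co, c_h2o, c_co2, c_h2, T, A=1.0e6, Ea=1.0e5):
    """Reversible water-gas shift CO + H2O <-> CO2 + H2.
    Net rate = kf*(C_CO*C_H2O - C_CO2*C_H2/Keq); A and Ea are placeholder values."""
    kf = arrhenius(A, Ea, T)
    keq = np.exp(4577.8 / T - 4.33)   # commonly used equilibrium-constant correlation
    return kf * (c_co * c_h2o - c_co2 * c_h2 / keq)

r = wgs_rate(c_co=2.0, c_h2o=6.0, c_co2=0.5, c_h2=1.0, T=1100.0)  # concentrations in mol/m^3, T in K
```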

Relevance:

90.00%

Publisher:

Abstract:

The objective of this study was to gain further understanding and elucidation of the fluid dynamic factors and flow-induced mechanisms of the thrombogenic process of platelet deposition onto, and possible subsequent embolization from, the walls of an arterial stenosis. This was accomplished by measuring the axial dependence of platelet deposition within a modeled arterial stenosis for a transitional flow and a completely laminar flow field. The stenotic region of the model was collagen-coated to simulate a damaged endothelial lining of an artery. The fluid dynamics within the stenosis were studied using qualitative flow visualization and compared with the in vitro platelet deposition studies. Normalized platelet density (NPD) measurements indicate decreased levels of NPD in the high shear throat region of the stenosis for a Reynolds number of 300 and a drastic increase in NPD at the throat for a Reynolds number of 175. This study provides further understanding of the fluid dynamic effects on thrombus development within a stenosis.

Relevance:

90.00%

Publisher:

Abstract:

This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which are about the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge is in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, which combine multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory of the relationships between deontic processes and documentary procedures; and a working prototype that uses a model checking technique to identify fraud potentials in a deontic process and generate control requirements to limit them. Fourteen scenarios of two well-known international payment procedures -- cash in advance and documentary credit -- have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
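A minimal sketch of the Petri-net machinery underlying the Deontic Petri Nets described above: places, transitions, and an exhaustive reachability search for "bad" markings, which is the core idea of simple model checking. The deontic/modal layer and the actual control-derivation logic of the thesis are not reproduced; the toy cash-in-advance net, place names and fraud condition below are assumptions for illustration.

```python
from collections import deque

# Toy cash-in-advance net: places and an initial marking (token counts per place).
PLACES = ("order_placed", "goods_ready", "payment_made", "goods_shipped")
INITIAL = (1, 1, 0, 0)

# Each transition: (tokens consumed, tokens produced), keyed by place name.
TRANSITIONS = {
    "pay":  ({"order_placed": 1},
             {"payment_made": 1}),
    "ship": ({"goods_ready": 1, "payment_made": 1},
             {"payment_made": 1, "goods_shipped": 1}),   # payment token is read and re-emitted
}

def idx(p):
    return PLACES.index(p)

def enabled(m, pre):
    return all(m[idx(p)] >= n for p, n in pre.items())

def fire(m, pre, post):
    m = list(m)
    for p, n in pre.items():
        m[idx(p)] -= n
    for p, n in post.items():
        m[idx(p)] += n
    return tuple(m)

def reachable(initial):
    """Breadth-first reachability search -- the core of a simple model checker."""
    seen, todo = {initial}, deque([initial])
    while todo:
        m = todo.popleft()
        for pre, post in TRANSITIONS.values():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                if m2 not in seen:
                    seen.add(m2)
                    todo.append(m2)
    return seen

# "Fraud potential" here: a marking where goods were shipped but no payment was made.
bad = [m for m in reachable(INITIAL)
       if m[idx("goods_shipped")] >= 1 and m[idx("payment_made")] == 0]
assert not bad   # payment-before-shipment holds in this toy net
```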

Relevance:

90.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a very critical task over recent years due to the diversity of, and high demand for, functionality, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid the deterioration of their quality during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results for different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. In addition, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the ROC (Receiver Operating Characteristic) curve of the regression model is 60%, which means that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
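A minimal sketch of the variation-analysis step (phase iii) described above: timing a scenario repeatedly in two releases and flagging a significant difference when bootstrap confidence intervals do not overlap. The function names, thresholds and placeholder scenarios are illustrative assumptions, not the thesis framework.

```python
import time
import numpy as np

def time_scenario(run_scenario, repetitions=30):
    """Run a scenario callable repeatedly and return its execution times in seconds."""
    samples = []
    for _ in range(repetitions):
        start = time.perf_counter()
        run_scenario()
        samples.append(time.perf_counter() - start)
    return np.array(samples)

def bootstrap_ci(samples, level=0.95, n_boot=2000, seed=0):
    """Bootstrap confidence interval for the mean execution time."""
    rng = np.random.default_rng(seed)
    means = [rng.choice(samples, size=len(samples), replace=True).mean() for _ in range(n_boot)]
    low, high = np.percentile(means, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return low, high

def has_performance_variation(old_samples, new_samples):
    """Flag variation when the two releases' confidence intervals do not overlap."""
    lo_old, hi_old = bootstrap_ci(old_samples)
    lo_new, hi_new = bootstrap_ci(new_samples)
    return hi_old < lo_new or hi_new < lo_old

# Placeholder callables standing in for the same scenario executed in two releases.
old = time_scenario(lambda: sum(i * i for i in range(20_000)))
new = time_scenario(lambda: sum(i * i for i in range(40_000)))
print("performance variation detected:", has_performance_variation(old, new))
```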

Relevance:

90.00%

Publisher:

Abstract:

Combining the advantages of parallel mechanisms and compliant mechanisms, a compliant parallel mechanism with two rotational DOFs (degrees of freedom) is designed to meet the requirement of a lightweight and compact pan-tilt platform. Firstly, two commonly used design methods, i.e. direct substitution and FACT (Freedom and Constraint Topology), are applied to design the configuration of the pan-tilt system, and the similarities and differences of the two design alternatives are compared. Then the inverse kinematic analysis of the candidate mechanism is carried out using the pseudo-rigid-body model (PRBM), and the Jacobian related to its differential kinematics is further derived to help designers perform dynamic analysis of the 8R compliant mechanism. In addition, the maximum stress occurring within the mechanism's workspace is evaluated by finite element analysis. Finally, a method to determine the joint damping of the flexure hinges is presented, which aims at exploring the effect of joint damping on actuator selection and real-time control. To the authors' knowledge, almost no existing literature addresses this issue.
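As a generic illustration of deriving a Jacobian for differential kinematics (not the 8R PRBM model itself), a finite-difference sketch for an idealized two-DOF pan-tilt pointing map from joint angles to a unit pointing vector; the forward map used here is an assumption for illustration.

```python
import numpy as np

def pointing_vector(q):
    """Forward map of an idealized pan-tilt head: joint angles (pan, tilt) -> unit pointing vector."""
    pan, tilt = q
    return np.array([np.cos(tilt) * np.cos(pan),
                     np.cos(tilt) * np.sin(pan),
                     np.sin(tilt)])

def numeric_jacobian(f, q, eps=1e-6):
    """Central-difference Jacobian of f at q."""
    q = np.asarray(q, dtype=float)
    f0 = f(q)
    J = np.zeros((f0.size, q.size))
    for j in range(q.size):
        dq = np.zeros_like(q)
        dq[j] = eps
        J[:, j] = (f(q + dq) - f(q - dq)) / (2 * eps)
    return J

J = numeric_jacobian(pointing_vector, q=[0.2, 0.1])
# Differential kinematics: a small joint motion dq maps to a task-space motion J @ dq.
```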

Relevance:

90.00%

Publisher:

Abstract:

Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in object-oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers configure ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impact. Our study finds that by resolving the detected anti-patterns, application performance can be improved by 34% on average. We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
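A minimal sketch of the dynamic-analysis idea behind detecting redundant data access anti-patterns: inspecting a log of executed SQL statements for queries issued repeatedly with the same shape (for example, an N+1 selects pattern). The log format, normalization rules and threshold are hypothetical, not the tooling described in the abstract.

```python
import re
from collections import Counter

def normalize(sql):
    """Reduce a query to its 'shape' by stripping literal values."""
    sql = re.sub(r"'[^']*'", "?", sql)       # string literals
    sql = re.sub(r"\b\d+\b", "?", sql)       # numeric literals
    return sql.strip().lower()

def detect_redundant_accesses(query_log, threshold=10):
    """Flag query shapes executed more than `threshold` times within one request/transaction."""
    shapes = Counter(normalize(q) for q in query_log)
    return {shape: n for shape, n in shapes.items() if n > threshold}

# Hypothetical log captured while executing one ORM-backed request.
log = [f"SELECT * FROM orders WHERE customer_id = {i}" for i in range(50)]
log.append("SELECT * FROM customers WHERE id = 7")
print(detect_redundant_accesses(log))
# {'select * from orders where customer_id = ?': 50}  -> likely an N+1 access pattern
```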

Relevance:

90.00%

Publisher:

Abstract:

This work aims to develop a kinetic selectivity methodology for petroleum pseudo-components in gas-liquid flow in bubble columns using Computational Fluid Dynamics (CFD). A cylindrical geometry 2.5 m high and 0.162 m in diameter was used both in the fluid-dynamic validation against experimental data from the literature and in the kinetic analysis of the reactor operating in two distinct modes with respect to the liquid phase: batch and continuous. All case studies operate in the heterogeneous flow regime, with a superficial gas velocity of 8 cm/s and a mean bubble diameter of 6 mm. The validated fluid-dynamic model showed good agreement with the experimental data and was used as the basis for implementing the kinetic network model of Krishna and Saxena (1989). The hydroconversion analysis was carried out at 371 °C, and the results showed the expected behaviour for the reactive process studied, defining the ideal collection times (batch) and axial positions (continuous) for the light pseudo-components. In summary, this highlights the use of the CFD tool for understanding, developing and optimizing processes.
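A minimal sketch of a lumped (pseudo-component) kinetic network of the general kind referred to above, integrated as ODEs over batch time; the lumps and first-order rate constants are hypothetical placeholders, not the Krishna and Saxena (1989) network or the CFD-coupled model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lumps in a simple series network: residue -> VGO -> middle distillates -> naphtha.
k = np.array([0.30, 0.20, 0.10])   # hypothetical first-order rate constants, 1/h

def network(t, c):
    r, vgo, mid, naph = c
    return [-k[0] * r,
            k[0] * r - k[1] * vgo,
            k[1] * vgo - k[2] * mid,
            k[2] * mid]

sol = solve_ivp(network, t_span=(0.0, 10.0), y0=[1.0, 0.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 10.0, 101)
c = sol.sol(t)                      # mass fractions of each lump over batch time
t_best = t[np.argmax(c[2])]         # batch time at which the intermediate (middle distillates) lump peaks
```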

Relevance:

90.00%

Publisher:

Abstract:

This work proposes a model to investigate the use of a cylindrical antenna in the thermal recovery of high-viscosity oil through electromagnetic radiation. The antenna has a simple geometry, of an adapted dipole type, and can be modelled using Maxwell's equations. Wavelet transforms are used as basis functions and applied in conjunction with the method of moments to obtain the current distribution on the antenna. The electric field, power and temperature distributions are calculated for the analysis of the antenna as an electromagnetic heating source. The energy performance is analyzed based on thermo-fluid dynamic simulations at field scale, through an adaptation of the Steam, Thermal, and Advanced Processes Reservoir Simulator (STARS) by the Computer Modelling Group (CMG). The proposed model and the numerical results obtained are stable and show good agreement with results reported in the specialized literature.
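As a rough illustration of the last step described above, turning a computed electric field into dissipated power and a local temperature rise, a minimal sketch using the standard relations q = σ|E|²/2 and ρ·c_p·dT/dt = q (neglecting conduction and convection); the material properties and field magnitude below are placeholders, not reservoir data from the work.

```python
import numpy as np

def dissipated_power_density(E_peak, sigma):
    """Time-averaged dissipated power density q = sigma*|E|^2/2 (W/m^3) for a sinusoidal field."""
    return 0.5 * sigma * np.abs(E_peak) ** 2

def adiabatic_temperature_rise(q, rho, cp, t):
    """Temperature rise neglecting heat losses: dT = q*t/(rho*cp)."""
    return q * t / (rho * cp)

# Placeholder values: field magnitude near the antenna, electrical conductivity,
# density and specific heat of the heated medium.
E_peak = 2.0e3              # V/m
sigma = 0.05                # S/m
rho, cp = 2000.0, 1500.0    # kg/m^3, J/(kg K)

q = dissipated_power_density(E_peak, sigma)              # ~ 1.0e5 W/m^3
dT = adiabatic_temperature_rise(q, rho, cp, t=3600.0)    # temperature rise after one hour of heating
```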