971 results for Flow process
Abstract:
There is an increasing reliance on computers to solve complex engineering problems. This is because computers, in addition to supporting the development and implementation of adequate and clear models, can substantially reduce the financial resources required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations for an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process; the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), as well as for fluid flow through porous media. The models have merit as a scientific tool and also have practical applications in industry. Most of the numerical simulations were performed with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole to a venule via a capillary showed that the model based on the volume-of-fluid (VOF) method can successfully predict the deformation and flow of red blood cells (RBCs) in an arteriole. Furthermore, the result corresponds to the experimental observation that the RBC deforms during its movement.
The concluding remarks provide a sound methodology and a mathematical and numerical framework for the simulation of blood flow in branching vessels. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even stronger than the conventional one and that its magnitude depends on the strength of the magnetic field, as confirmed experimentally by Völker and Odenbach. It was also shown that when a magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared with cases where the magnetic field is parallel to the temperature gradient. In addition, a statistical evaluation (the Taguchi technique) of the magnetic fluids showed that the temperature and the initial concentration of the magnetic phase make the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The results agreed well with the correlation of Macdonald et al. (1979) for the range of flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
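The porous-media comparison above rests on an Ergun-type correlation. The following is a minimal sketch of such a correlation, using the constants A = 180 and B = 1.8 commonly attributed to Macdonald et al. (1979) for smooth particles; the thesis' exact Reynolds-number and pressure-drop definitions may differ:

```python
# Sketch: dimensionless pressure drop (friction factor) in a packed bed
# from an Ergun-type correlation of the form used by Macdonald et al.
# (1979): f = A / Re + B.  A = 180 and B = 1.8 are the smooth-particle
# constants; the thesis' exact Re definition is an assumption here.

def macdonald_friction_factor(re):
    """Dimensionless pressure drop f = A/Re + B."""
    A, B = 180.0, 1.8
    return A / re + B

# At low Re the viscous (A/Re) term dominates; at high Re the
# inertial constant B does.
for re in (0.1, 1.0, 10.0, 100.0):
    print(re, macdonald_friction_factor(re))
```

At Re = 1 the sketch gives f = 181.8, illustrating the viscous-dominated regime the dimensionless pressure-drop study covers.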
Abstract:
As a result of the growing interest in studying employee well-being as a complex process that exhibits high levels of within-individual variability and evolves over time, the present study considers the experience of flow in the workplace from a nonlinear dynamical systems approach. Our goal is to offer new ways to move the study of employee well-being beyond linear approaches. With nonlinear dynamical systems theory as the backdrop, we conducted a longitudinal study using the experience sampling method and qualitative semi-structured interviews for data collection; 6,981 data records were collected from a sample of 60 employees. The resulting time series were analyzed using techniques derived from nonlinear dynamical systems theory (i.e., recurrence analysis and surrogate data) and multiple correspondence analysis. The results revealed the following: 1) flow in the workplace presents a high degree of within-individual variability, which is chaotic in most cases (75%); 2) high levels of flow are associated with chaos; and 3) different dimensions of the flow experience (e.g., merging of action and awareness) as well as individual characteristics (e.g., age) and job characteristics (e.g., job tenure) are associated with the emergence of different dynamic patterns (chaotic, linear, and random).
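The recurrence analysis mentioned above can be illustrated with a minimal sketch: a thresholded distance matrix of a scalar time series and the recurrence rate, the most basic recurrence quantification measure. The omission of the embedding step and the threshold value are simplifications for illustration, not details taken from the study:

```python
import numpy as np

# Minimal sketch of a recurrence-analysis step.  Assumptions (not from
# the study): no phase-space embedding, and an arbitrary threshold eps.

def recurrence_matrix(x, eps):
    """1 where two points of the series are closer than eps, else 0."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    return (d <= eps).astype(int)

def recurrence_rate(rm):
    """Fraction of recurrent points, a basic RQA measure."""
    return rm.mean()

series = np.sin(np.linspace(0, 8 * np.pi, 200))   # toy periodic series
rm = recurrence_matrix(series, eps=0.1)
print(round(float(recurrence_rate(rm)), 3))
```

In a full analysis, measures such as determinism and the surrogate-data comparison would be computed on top of this matrix to distinguish chaotic from linear or random dynamics.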
Abstract:
Virtually every cell and organ in the human body depends on a proper oxygen supply. This is taken care of by the cardiovascular system, which supplies tissues with oxygen precisely according to their metabolic needs. Physical exercise is one of the most demanding challenges the human circulatory system can face. During exercise, skeletal muscle blood flow can easily increase some 20-fold, and its proper distribution to and within muscles is important for optimal oxygen delivery. The local regulation of skeletal muscle blood flow during exercise remains poorly understood, but adenosine and nitric oxide may take part in this process. In addition to acute exercise, long-term vigorous physical conditioning also induces changes in the cardiovascular system, which lead to improved maximal physical performance. The changes are largely central, such as structural and functional changes in the heart. The function and reserve of the heart's own vasculature can be studied by adenosine infusion, which according to animal studies evokes vasodilation via its A2A receptors. This has, however, never been addressed in humans in vivo, and studies in endurance athletes have shown inconsistent results regarding the effects of sports training on myocardial blood flow. This study was performed on healthy young adults and endurance athletes, and local skeletal and cardiac muscle blood flow was measured by positron emission tomography. In the heart, myocardial blood flow reserve and adenosine A2A receptor density were measured; in skeletal muscle, oxygen extraction and consumption were also measured. The role of adenosine in the control of skeletal muscle blood flow during exercise, and its vasodilator effects, were addressed by infusing competitive inhibitors and adenosine into the femoral artery. The formation of skeletal muscle nitric oxide was also inhibited pharmacologically, with and without prostanoid blockade.
In conclusion, skeletal muscle blood flow heterogeneity decreases with increasing exercise intensity, most likely due to increased vascular unit recruitment, but exercise hyperemia is a very complex phenomenon that cannot be mimicked by pharmacological infusions, and no single regulatory factor (e.g., adenosine or nitric oxide) accounts for a significant part of exercise-induced muscle hyperemia. However, the present study observed for the first time in humans that nitric oxide is an important regulator not only of the basal level of muscle blood flow but also of oxygen consumption, and that together with prostanoids it affects muscle blood flow and oxygen consumption during exercise. Finally, even vigorous endurance training does not seem to lead to a supranormal myocardial blood flow reserve, and receptors other than A2A also mediate the vasodilator effects of adenosine. With respect to cardiac work, the athlete's heart seems to be luxuriously perfused at rest, which may result from reduced oxygen extraction or impaired efficiency due to the pronouncedly enhanced myocardial mass developed to excel in strenuous exercise.
Abstract:
This research was motivated by the need to examine potential application areas of process intensification technologies at Neste Oil Oyj. In line with the company's interest, membrane reactor technology was chosen, and the applicability of this technology in the refining industry was investigated. Moreover, Neste Oil suggested a project related to CO2 capture from the FCC unit flue gas stream. The flow rate of the flue gas is 180 t/h, and it consists of approximately 14% CO2 by volume. A membrane-based absorption process (membrane contactor) was chosen as a potential technique to model CO2 capture from the fluid catalytic cracking (FCC) unit effluent. For the design of the membrane contactor, a mathematical model was developed to describe CO2 absorption from a gas mixture using an aqueous monoethanolamine (MEA) solution. According to the literature survey, approximately 99% of the CO2 can be removed in a hollow fiber contactor under laminar flow conditions using a 20 cm long polyvinylidene fluoride (PVDF) membrane. Furthermore, the design of the whole process was carried out using PRO/II simulation software, and the CO2 removal efficiency of the whole process was found to be 97%. Technical and economic comparisons with existing MEA absorption processes were performed to determine the advantages and disadvantages of membrane contactor technology.
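As a rough illustration of the capture duty described above, the following sketch converts the stated figures (180 t/h of flue gas, 14 vol% CO2, 97% overall removal) into a captured-CO2 mass flow. The mean molar mass of the flue gas is an assumed value, not a number from the thesis:

```python
# Back-of-the-envelope sketch of the CO2 capture duty.  Assumption
# (labeled, not from the thesis): the flue gas is treated as an ideal
# mixture with a mean molar mass of ~30 g/mol, so vol% equals mol%.

M_GAS = 30.0   # assumed mean molar mass of flue gas, g/mol
M_CO2 = 44.0   # molar mass of CO2, g/mol

def co2_captured_t_per_h(flue_t_per_h, y_co2, removal):
    """Mass flow of captured CO2 [t/h] from the total flue gas mass
    flow [t/h], CO2 mole fraction y_co2, and removal efficiency."""
    total_kmol_h = flue_t_per_h * 1000.0 / M_GAS   # t/h -> kg/h -> kmol/h
    co2_kmol_h = total_kmol_h * y_co2
    return co2_kmol_h * removal * M_CO2 / 1000.0   # kmol/h -> t/h

# Roughly 36 t/h of CO2 captured under these assumptions.
print(round(co2_captured_t_per_h(180.0, 0.14, 0.97), 1))
```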
Abstract:
Determination of the viability of bacteria by the conventional plating technique is a time-consuming process. Methods based on enzyme activity or membrane integrity are much faster and may be good alternatives. Assessment of the viability of suspensions of the plant-pathogenic bacterium Clavibacter michiganensis subsp. michiganensis (Cmm) using the fluorescent probes calcein acetoxymethyl ester (Calcein AM), carboxyfluorescein diacetate (cFDA), and propidium iodide (PI) in combination with flow cytometry was evaluated. Heat-treated and viable (non-treated) Cmm cells labeled with Calcein AM, cFDA, PI, or combinations of Calcein AM and cFDA with PI could be distinguished based on their fluorescence intensity in flow cytometry analysis. Non-treated cells showed relatively high green fluorescence levels due to staining with either Calcein AM or cFDA, whereas damaged (heat-treated) cells showed high red fluorescence levels due to staining with PI. Flow cytometry also allowed rapid quantification of viable Cmm cells labeled with Calcein AM or cFDA and of heat-treated cells labeled with PI. Therefore, flow cytometry in combination with fluorescent probes appears to be a promising technique for assessing the viability of Cmm cells when they are labeled with Calcein AM or the combination of Calcein AM with PI.
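The viable/damaged discrimination described above can be caricatured as a simple two-channel gating rule: high red (PI) fluorescence marks membrane-damaged cells, high green (Calcein AM or cFDA) fluorescence marks viable ones. The threshold values below are arbitrary placeholders, not gates from the study:

```python
# Illustrative gating logic for the dual-staining scheme: viable cells
# stain green (Calcein AM / cFDA), membrane-damaged cells stain red
# (PI).  GREEN_GATE and RED_GATE are assumed placeholder thresholds.

GREEN_GATE = 1000.0  # assumed green-fluorescence threshold
RED_GATE = 1000.0    # assumed red-fluorescence threshold

def classify_event(green, red):
    if red >= RED_GATE:
        return "damaged"    # PI entered the cell: membrane compromised
    if green >= GREEN_GATE:
        return "viable"     # esterase activity cleaved the green probe
    return "unstained"

events = [(5000, 50), (80, 4000), (10, 20)]
print([classify_event(g, r) for g, r in events])
# -> ['viable', 'damaged', 'unstained']
```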
Abstract:
This thesis was made for the business segment of a large forest industry company. The purpose of the study was to improve the performance of the business segment's order-to-delivery process. The study proceeded in three phases. The first phase was to define customer expectations in the market. The second phase was to analyse the performance and operations of the order-to-delivery process and to identify any challenges or problems in serving customers. The third and final phase was to improve the performance of the order-to-delivery process within the scope defined by the first two phases. The analysis showed that delivery reliability is an essential but challenging issue in the case company's markets. From a delivery reliability standpoint, the most challenging factors were the detected distortions of information flow within the company as well as in the whole supply chain, and the lack of horizontal control over the multi-stage process.
Abstract:
The objective of this work is to study fluid flow behaviour through a pinch valve and to estimate the flow coefficient (Kv) at different opening positions of the valve. The flow inside a compressed valve is more complex than in a straight pipe, and it is one of the main topics of interest for engineers in the process industry. In the present work, we have numerically simulated the compressed valve flow at different opening positions. In order to simulate the flow through the pinch valve, several models of the elastomeric valve tube (pinch valve tube) at different opening positions were constructed in 2D-axisymmetric and 3D geometries. The numerical simulations were performed with the CFD packages ANSYS FLUENT and ANSYS CFX using parallel computing. The distributions of static pressure, velocity, and turbulent kinetic energy were studied at different opening positions of the valve in both the 2D-axisymmetric and 3D cases. The flow coefficient (Kv) values were computed at different valve openings and compared between the 2D-axisymmetric and 3D simulation results.
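For reference, the flow coefficient Kv is conventionally defined (in the IEC 60534 convention) as the water flow in m³/h that passes the valve at a 1 bar pressure drop. The following is a minimal sketch of that textbook definition, not the thesis' CFD post-processing:

```python
import math

# Kv per the common IEC 60534 convention:
#   Kv = Q * sqrt((rho / rho_water) / dp)
# with Q in m3/h, dp in bar, and rho the flowing-fluid density.
# For water, the square-root factor reduces to 1/sqrt(dp).

def flow_coefficient_kv(q_m3h, dp_bar, density=998.0, rho_water=998.0):
    """Flow coefficient Kv [m3/h at 1 bar drop]."""
    return q_m3h * math.sqrt((density / rho_water) / dp_bar)

# Example: 36 m3/h of water across a 4 bar pressure drop.
print(round(flow_coefficient_kv(36.0, 4.0), 1))  # -> 18.0
```

In a CFD study such as this one, Q and dp would be taken from the simulated flow rate and the pressure drop across the valve at each opening position.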
Abstract:
This thesis presents a three-dimensional, semi-empirical, steady-state model for simulating combustion, gasification, and the formation of emissions in circulating fluidized bed (CFB) processes. In a large-scale CFB furnace, the local feeding of fuel, air, and other input materials, as well as the limited mixing rate of the different reactants, produce inhomogeneous process conditions. To simulate the real conditions, the furnace should be modelled three-dimensionally or the three-dimensional effects should be taken into account. The only available methods for simulating large CFB furnaces three-dimensionally are semi-empirical models, which apply a relatively coarse calculation mesh and a combination of fundamental conservation equations, theoretical models, and empirical correlations. The number of such models is extremely small. The main objective of this work was to develop a model which can be applied to industrial-scale CFB boilers and which can simulate all the essential sub-phenomena: fluid dynamics, reactions, the attrition of particles, and heat transfer. The core of the work was to develop the model frame and the sub-models required for determining the combustion and sorbent reactions. The objective was reached, and the developed model was successfully used to study various industrial-scale CFB boilers combusting different types of fuel. The sorbent reaction model, which includes the main reactions of calcitic limestones, was applied to study possible new phenomena occurring in oxygen-fired combustion. The presented combustion and sorbent models and principles can be utilized in other model approaches as well, including other empirical and semi-empirical approaches and CFD-based simulations.
The main achievement is the overall model frame, which can be utilized for the further development and testing of new sub-models and theories, and for consolidating the knowledge gathered from experimental work carried out on bench-, pilot-, and industrial-scale apparatus and from computational work performed with other modelling methods.
Abstract:
Electrokinetic remediation coupled with Fenton oxidation, widely known as the electrokinetic Fenton process, is a potential soil remediation technique for low-permeability soils. The applicability of the process has been demonstrated for soils contaminated with a wide range of organic compounds, from phenol to the most recalcitrant ones such as PAHs and POPs. This thesis summarizes the major findings of an electrokinetic Fenton process study conducted for the remediation of low-permeability soil contaminated with hexachlorobenzene (HCB), a typical hydrophobic organic contaminant. A model low-permeability soil, kaolin, was artificially contaminated with HCB and subjected to electrokinetic Fenton treatments in a series of laboratory-scale batch experiments. The use of cyclodextrins as enhancement agents to mobilize the sorbed contaminant through the system was investigated. Major process hindrances, such as oxidant availability and treatment duration, were also addressed. HCB degradation, along with other parameters such as soil pH, redox potential, and cumulative catholyte flow, was analyzed and monitored. The results of the experiments strengthen the existing knowledge of the electrokinetic Fenton process as a promising technology for the treatment of soil contaminated with hydrophobic organic compounds. It was demonstrated that HCB sorbed to kaolin can be degraded by using high concentrations of hydrogen peroxide during such processes. The overall system performance was observed to be influenced by the point and mode of oxidant delivery. Furthermore, the study contributes new knowledge on shortening the treatment duration by adopting electrode polarity reversal during the process.
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus give companies better readiness to face the challenges of cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling processes of four different organizations. The invoice data were collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices.
It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning were ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that they had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps, and leveraging value-added services such as invoice validation, mobile solutions, and supply chain financing. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
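The lead-time analysis described above can be sketched as a small event-log computation: for each invoice, the time between consecutive process steps is accumulated per step-to-step transition, a basic process-mining operation. The step names and timestamps below are invented for illustration:

```python
from datetime import datetime
from collections import defaultdict

# Sketch of a lead-time analysis over an invoice event log.  Input:
# {invoice_id: [(step_name, timestamp), ...]} with events sorted by
# time.  Output: mean hours spent on each observed transition.

def transition_lead_times(event_log):
    sums, counts = defaultdict(float), defaultdict(int)
    for events in event_log.values():
        for (s1, t1), (s2, t2) in zip(events, events[1:]):
            sums[(s1, s2)] += (t2 - t1).total_seconds() / 3600.0
            counts[(s1, s2)] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Hypothetical log: one invoice received, matched 3 h later,
# posted a day after that.
log = {"inv-1": [("received", datetime(2012, 3, 1, 9)),
                 ("matched", datetime(2012, 3, 1, 12)),
                 ("posted", datetime(2012, 3, 2, 12))]}
print(transition_lead_times(log))
```

Sorting the resulting transitions by mean duration is one way to surface the longest lead times and the prolonging effect of manual steps discussed above.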
Abstract:
The flow structure of cold and ignited jets issuing into a co-flowing air stream was studied experimentally using a laser Doppler velocimeter. Methane was employed as the jet fluid, discharging from circular and elliptic nozzles with aspect ratios varying from 1.29 to 1.60. The diameter of the circular nozzle was 4.6 mm, and the elliptic nozzles had approximately the same exit area as the circular nozzle. These non-circular nozzles were employed in order to increase the stability of attached jet diffusion flames. The time-averaged velocity and the r.m.s. value of the velocity fluctuation in the streamwise and transverse directions were measured over the range of co-flowing stream velocities corresponding to different modes of flame blowout, identified as either lifted or attached flames. On the basis of these measurements, attempts were made to explain the existence of an apparent optimum aspect ratio for the blowout of attached flames observed at higher co-flowing stream velocities. The insensitivity of the blowout limits of lifted flames to nozzle geometry, observed in our previous work at low co-flowing stream velocities, was also explained. Measurements of the fuel concentration at the jet centerline indicated that mixing was enhanced with the 1.38 aspect ratio jet compared with the 1.60 aspect ratio jet. On the basis of the experimental data, it was suggested that the higher blowout limits of attached flames for the elliptic jet of 1.38 aspect ratio were due to higher entrainment rates.
Abstract:
Knowledge of slug flow characteristics is very important when designing pipelines and process equipment. When the intermittency typical of slug flow occurs, the fluctuations of the flow variables bring additional concern to the designer. Focusing on this subject, the present work discloses experimental data on slug flow characteristics in a large-size, large-scale facility. The results were compared with data provided by mechanistic slug flow models in order to verify their reliability in modelling actual flow conditions. Experiments were conducted with natural gas and oil or water as the liquid phase. To compute the frequency and velocity of the slug cell and to calculate the lengths of the elongated bubble and the liquid slug, two pressure transducers were used to measure the pressure drop across the pipe diameter at different axial locations. A third pressure transducer measured the pressure drop between two axial locations 200 m apart. The experimental data were compared with the results of Camargo's algorithm (1991, 1993), which uses the basics of Dukler & Hubbard's (1975) slug flow model, and with those calculated by the transient two-phase flow simulator OLGA.
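One common way to estimate the slug-cell translational velocity from two axially separated pressure transducers, as in the setup above, is to locate the cross-correlation peak of the two signals and divide the sensor spacing by the corresponding time lag. The following sketch uses a synthetic signal; the spacing, sampling rate, and signal shape are made-up values, not the facility's:

```python
import numpy as np

# Sketch: slug translational velocity from two pressure signals via
# the cross-correlation time lag.  All numerical values are invented
# for illustration.

def slug_velocity(p_upstream, p_downstream, spacing_m, fs_hz):
    """Velocity = sensor spacing / lag at the cross-correlation peak."""
    p1 = p_upstream - np.mean(p_upstream)
    p2 = p_downstream - np.mean(p_downstream)
    corr = np.correlate(p2, p1, mode="full")
    lag_samples = np.argmax(corr) - (len(p1) - 1)
    return spacing_m * fs_hz / lag_samples

fs = 100.0                                 # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
sig = np.where((t % 2) < 0.3, 1.0, 0.0)    # synthetic slug passages
delay = 50                                 # 0.5 s travel between sensors
p_up, p_down = sig[delay:], sig[:-delay]   # downstream lags upstream
print(slug_velocity(p_up, p_down, spacing_m=1.0, fs_hz=fs))  # -> 2.0 m/s
```

In practice the raw transducer signals would be filtered first, and the slug frequency follows from the same records (e.g., by counting pressure-drop events per unit time).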
Abstract:
One of the main problems related to the transport and manipulation of multiphase fluids concerns the existence of characteristic flow patterns and their strong influence on important operating parameters. A good example occurs in gas-liquid chemical reactors, in which maximum efficiency can be achieved by maintaining a finely dispersed bubbly flow that maximizes the total interfacial area. Thus, the ability to automatically detect flow patterns is of crucial importance, especially for the adequate operation of multiphase systems. This work describes the application of a neural model to process the signals delivered by a direct imaging probe and produce a diagnosis of the corresponding flow pattern. The neural model consists of six independent neural modules, each trained to detect one of the main horizontal flow patterns, and a final winner-take-all layer responsible for resolving cases in which two or more patterns are detected simultaneously. Experimental signals representing different bubbly, intermittent, annular, and stratified flow patterns were used to validate the neural model.
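The decision stage of such a model can be illustrated by its final winner-take-all rule: each module emits a confidence for its flow pattern, and the highest-scoring pattern is reported. The pattern names and scores below are illustrative placeholders, not the paper's actual classes or outputs:

```python
# Minimal sketch of a winner-take-all decision over six per-pattern
# module outputs.  Pattern names are assumed examples of horizontal
# flow regimes, not necessarily the six used in the study.

PATTERNS = ["dispersed bubbly", "plug", "slug", "annular",
            "stratified smooth", "stratified wavy"]

def winner_take_all(scores):
    """Return the pattern whose module produced the highest score."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return PATTERNS[best]

# Two modules fire strongly (plug and slug); the stronger one wins.
print(winner_take_all([0.1, 0.7, 0.9, 0.0, 0.05, 0.1]))  # -> slug
```

This is exactly the conflict-resolution role the final layer plays when two or more pattern detectors respond to the same probe signal.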
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. Thanks to much-improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping stage of papermaking. Pulping starts with wood disintegration and the forming of the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index that correlate well with ground-truth values. The bubble detection method, the second contribution, was developed to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately, and the solution enabled just-in-time process termination, whereas accurate estimation of bubble size categories remained challenging. As the third contribution of the study, optical flow computation was studied, and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images.
Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground-truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. Together, these four contributions assist in developing an integrated, factory-level, vision-based process control system.
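The curl index mentioned in the first contribution is commonly defined as the ratio of a fiber's contour length to its end-to-end distance, minus one; whether the thesis uses exactly this definition is an assumption. A minimal sketch over a traced fiber centerline:

```python
import math

# Sketch of the common curl-index definition:
#   curl = (contour length / end-to-end distance) - 1
# A perfectly straight fiber gives 0; curlier fibers give larger values.
# The fiber centerline is assumed to be available as pixel coordinates
# (e.g., from skeletonizing a segmented microscopic image).

def curl_index(points):
    """points: fiber centerline as a list of (x, y) coordinates."""
    contour = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    end_to_end = math.dist(points[0], points[-1])
    return contour / end_to_end - 1.0

straight = [(0, 0), (1, 0), (2, 0)]   # straight fiber: curl = 0
bent = [(0, 0), (1, 1), (2, 0)]       # bent fiber: curl > 0
print(curl_index(straight), round(curl_index(bent), 3))
```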
Abstract:
Microreactors have proven to be versatile tools for process intensification. Over recent decades, they have increasingly been used for product and process development in the chemical industry. Enhanced heat and mass transfer in the reactors, due to the extremely high surface-area-to-volume ratio and interfacial area, allows chemical processes to be operated at extreme conditions. Safety is improved by the small holdup volume of the reactors and effective control of pressure and temperature. Hydrogen peroxide is a powerful green oxidant used in a wide range of industries. Reduction and auto-oxidation of anthraquinones is currently the main process for hydrogen peroxide production. Direct synthesis is a green alternative with potential for on-site production. However, it has two limitations: safety concerns because of the explosive gas mixture produced, and the low selectivity of the process. The aim of this thesis was to develop a process for the direct synthesis of hydrogen peroxide utilizing microreactor technology. Experimental and numerical approaches were applied to the development of the microreactor. Development of a novel microreactor began with a study of the hydrodynamics and mass transfer in prototype microreactor plates. The prototypes were designed and fabricated with the assistance of CFD modeling to optimize the shape and size of the microstructure. Empirical correlations for the mass transfer coefficient were derived. The pressure drop in micro T-mixers was investigated experimentally and numerically. Correlations describing the friction factor for different flow regimes were developed, and the predicted values were in good agreement with experimental results. Experimental studies were conducted to develop a highly active and selective catalyst in a form suitable for the microreactor. Pd catalysts supported on activated carbon cloths were prepared using different treatments during catalyst preparation.
A variety of characterization methods were used to investigate the catalysts. The surface chemistry of the support and the oxidation state of the metallic phase in the catalyst play important roles in catalyst activity and selectivity for the direct synthesis. The direct synthesis of hydrogen peroxide was investigated in a bench-scale continuous process using the novel microreactor developed. The microreactor was fabricated on the basis of the hydrodynamic and mass transfer studies and provided a high interfacial area and a high mass transfer coefficient. The catalysts were prepared under optimum treatment conditions, and the direct synthesis was conducted at various conditions. The thesis represents a step towards a commercially viable direct synthesis, focusing on the two main challenges: mitigating the safety problem by utilizing microprocess technology and improving the selectivity through catalyst development.
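Friction-factor correlations of the kind mentioned above typically take the form f = C/Re in the laminar regime, with C a fitted constant. The sketch below uses the classical circular-duct value C = 64 together with the Darcy-Weisbach relation; the thesis' fitted constants and channel geometries are not reproduced here:

```python
# Hedged sketch of a laminar friction-factor correlation, f = C / Re,
# and the Darcy-Weisbach pressure drop it implies.  C = 64 is the
# classical circular-duct value, used here only as a placeholder for
# whatever constants were fitted for the micro T-mixers.

def friction_factor(re, c=64.0):
    """Darcy friction factor for laminar channel flow, f = C / Re."""
    return c / re

def pressure_drop(re, length, d_h, rho, velocity, c=64.0):
    """Darcy-Weisbach: dp = f * (L / D_h) * (rho * v^2 / 2)  [Pa]."""
    return friction_factor(re, c) * (length / d_h) * 0.5 * rho * velocity**2

print(friction_factor(100.0))  # -> 0.64
```

In the experimental work, the fitted constant C (and possibly a different functional form per flow regime) would replace the placeholder value.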