22 results for Fluid dynamics -- Data processing

in University of Queensland eSpace - Australia


Relevance:

100.00%

Publisher:

Abstract:

Simplicity in design and minimal floor space requirements render the hydrocyclone the preferred classifier in mineral processing plants. Empirical models have been developed for design and process optimisation, but due to the complexity of the flow behaviour in the hydrocyclone these do not provide information on the internal separation mechanisms. To study the interaction of design variables, the flow behaviour needs to be considered, especially when modelling the new three-product cyclone. Computational fluid dynamics (CFD) was used to model the three-product cyclone, in particular the influence of the dual vortex finder arrangement on flow behaviour. From experimental work performed on the UG2 platinum ore, significant differences in the classification performance of the three-product cyclone were noticed with variations in the inner vortex finder length. Because of this, simulations were performed for a range of inner vortex finder lengths. Simulations were also conducted on a conventional hydrocyclone of the same size to enable a direct comparison of the flow behaviour between the two cyclone designs. Significantly, high velocities were observed for the three-product cyclone with an inner vortex finder extended deep into the conical section of the cyclone. CFD studies revealed that in the three-product cyclone a cylindrically shaped air-core is observed, similar to that in conventional hydrocyclones. A constant-diameter air-core was observed throughout the inner vortex finder length, while no air-core was present in the annulus. (c) 2006 Elsevier Ltd. All rights reserved.


Computational fluid dynamics was used to search for the links between the observed pattern of attack seen in a bauxite refinery's heat exchanger headers and the hydrodynamics inside the header. Validation of the computational fluid dynamics results was done by comparing them with flow parameters measured in a 1:5 scale model of the first pass header in the laboratory. Computational fluid dynamics simulations were used to establish hydrodynamic similarity between the 1:5 scale and full scale models of the first pass header. It was found that the erosion-corrosion damage seen at the tubesheet of the first pass header was a consequence of increased levels of turbulence at the tubesheet caused by a rapidly turning flow. A prismatic flow correction device introduced in the past helped in rectifying the problem at the tubesheet but exacerbated the erosion-corrosion problem at the first pass header shell. A number of alternative flow correction devices were tested using computational fluid dynamics. Axial ribbing in the first pass header and an inlet flow diffuser have shown the best performance and were recommended for implementation. Computational fluid dynamics simulations have revealed a smooth, orderly, low-turbulence flow pattern in the second, third and fourth pass as well as the exit headers, where no erosion-corrosion was seen in practice. This study has confirmed that near-wall turbulence intensity, which can be successfully predicted by using computational fluid dynamics, is a good hydrodynamic predictor of erosion-corrosion damage in complex geometries. (c) 2006 Published by Elsevier Ltd.
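Near-wall turbulence intensity is named above as the hydrodynamic predictor of erosion-corrosion damage. As a minimal sketch (not the paper's code), turbulence intensity can be computed from a velocity time series as the ratio of the RMS fluctuation to the mean velocity:

```python
import numpy as np

def turbulence_intensity(u):
    """Turbulence intensity I = u_rms / u_mean for a velocity time series."""
    u = np.asarray(u, dtype=float)
    fluct = u - u.mean()                      # fluctuating component u'
    return np.sqrt(np.mean(fluct ** 2)) / u.mean()

# Synthetic series: 2 m/s mean flow with ~10% fluctuations
rng = np.random.default_rng(0)
series = 2.0 + 0.2 * rng.standard_normal(10_000)
print(round(turbulence_intensity(series), 2))  # close to 0.1
```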


The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
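The fit-then-invert structure described above can be sketched in toy form: fit a surrogate pressure model per port to CFD samples, then invert it for angle of attack from measured pressures. The quadratic surrogate, the coefficients, and the single-variable grid inversion below are illustrative assumptions, not the HYFLEX pressure model:

```python
import numpy as np

alphas = np.linspace(-10.0, 10.0, 21)              # "CFD" sample angles (deg)
true_coeffs = np.array([[1.00, -0.030, -0.0010],   # hypothetical per-port
                        [0.95,  0.025, -0.0012]])  # quadratic coefficients

def port_pressures(coeffs, alpha):
    """Normalised pressure at each port for a given angle of attack."""
    return coeffs @ np.array([1.0, alpha, alpha ** 2])

# Stand-in for CFD data: port pressures at each sampled alpha
P = np.array([port_pressures(true_coeffs, a) for a in alphas])

# Calibration: least-squares fit of the quadratic surrogate, per port
A = np.column_stack([np.ones_like(alphas), alphas, alphas ** 2])
fit, *_ = np.linalg.lstsq(A, P, rcond=None)
fit = fit.T

# Flight-style inversion: pick the alpha minimising the pressure residual
measured = port_pressures(true_coeffs, 3.7)
grid = np.linspace(-10.0, 10.0, 2001)
resid = [np.sum((port_pressures(fit, a) - measured) ** 2) for a in grid]
alpha_hat = grid[int(np.argmin(resid))]
print(round(alpha_hat, 2))  # close to 3.7
```

In the actual FADS processing, several air data parameters (angle of attack, sideslip, dynamic pressure) are solved simultaneously; the one-parameter search here only illustrates the structure.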


Traditional waste stabilisation pond (WSP) models encounter problems predicting pond performance because they cannot account for the influence of pond features, such as inlet structure or pond geometry, on fluid hydrodynamics. In this study, two dimensional (2-D) computational fluid dynamics (CFD) models were compared to experimental residence time distributions (RTD) from the literature. In one of the three geometries simulated, the 2-D CFD model successfully predicted the experimental RTD. However, flow patterns in the other two geometries were not well described due to the difficulty of representing the three dimensional (3-D) experimental inlet in the 2-D CFD model, and the sensitivity of the model results to the assumptions used to characterise the inlet. Neither a velocity similarity nor a geometric similarity approach to inlet representation in 2-D gave results correlating with experimental data. However, it was shown that 2-D CFD models were not affected by changes in values of model parameters which are difficult to predict, particularly the turbulent inlet conditions. This work suggests that 2-D CFD models cannot be used a priori to give an adequate description of the hydrodynamic patterns in WSP. (C) 1998 Elsevier Science Ltd. All rights reserved.


Numerical simulations of turbulence-driven flow in a dense medium cyclone with magnetite medium have been conducted using Fluent. The predicted air core shape and diameter were found to be close to the experimental results measured by gamma ray tomography. It is possible that the large eddy simulation (LES) turbulence model with the Mixture multi-phase model can be used to predict the air/slurry interface accurately, although the LES may need a finer grid. Multi-phase simulations (air/water/medium) show appropriate medium segregation effects but over-predict the level of segregation compared to that measured by gamma-ray tomography, in particular over-predicting medium concentrations near the wall. Further, we investigated the accurate prediction of axial segregation of magnetite using the LES turbulence model together with the multi-phase mixture model and viscosity corrections according to the feed particle loading factor. Addition of lift forces and viscosity corrections improved the predictions, especially near the wall. Predicted density profiles are very close to gamma ray tomography data, showing a clear density drop near the wall. The effect of the size distribution of the magnetite has been fully studied. It is interesting to note that the ultra-fine magnetite sizes (i.e. 2 and 7 μm) are distributed uniformly throughout the cyclone. As the size of magnetite increases, more segregation of magnetite occurs close to the wall. The cut size (d50) of the magnetite segregation is 32 μm, which is expected with the superfine magnetite feed size distribution. At higher feed densities the agreement between the [Dungilson, 1999; Wood, J.C., 1990. A performance model for coal-washing dense medium cyclones, Ph.D. Thesis, JKMRC, University of Queensland] correlations and the CFD is reasonably good, but the overflow density is lower than the model predictions. It is believed that the excessive underflow volumetric flow rates are responsible for under-prediction of the overflow density. (c) 2006 Elsevier Ltd. All rights reserved.
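A cut size such as the d50 quoted above is conventionally read off a partition curve by interpolation. A sketch with hypothetical partition data (the numbers below are illustrative, not the paper's):

```python
import numpy as np

# Hypothetical partition data: particle size (μm) vs fraction to underflow
sizes = np.array([2.0, 7.0, 16.0, 32.0, 64.0, 125.0])
partition = np.array([0.05, 0.15, 0.33, 0.50, 0.78, 0.95])

# d50: the size reporting equally to both products, found by interpolating
# the partition curve on a log-size axis
d50 = float(np.exp(np.interp(0.5, partition, np.log(sizes))))
print(round(d50))  # 32
```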


DEM modelling of the motion of coarse fractions of the charge inside SAG mills has now been well established for more than a decade. In these models the effect of slurry has broadly been ignored due to its complexity. Smoothed particle hydrodynamics (SPH) provides a particle based method for modelling complex free surface fluid flows and is well suited to modelling fluid flow in mills. Previous modelling has demonstrated the powerful ability of SPH to capture dynamic fluid flow effects such as lifters crashing into slurry pools, fluid draining from lifters, flow through grates and pulp lifter discharge. However, all these examples were limited by the ability to model only the slurry in the mill without the charge. In this paper, we represent the charge as a dynamic porous media through which the SPH fluid is then able to flow. The porous media properties (specifically the spatial distribution of porosity and velocity) are predicted by time averaging the mill charge predicted using a large scale DEM model. This allows prediction of transient and steady state slurry distributions in the mill and allows its variation with operating parameters, slurry viscosity and slurry volume, to be explored. (C) 2006 Published by Elsevier Ltd.
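The time-averaging of DEM charge data into a porosity field can be sketched as follows; the 2-D lumped-area binning and all parameters are simplifying assumptions, not the authors' implementation:

```python
import numpy as np

def porosity_field(snapshots, r, bins, extent):
    """Time-averaged 2-D porosity on a grid from DEM particle snapshots.

    snapshots: list of (N, 2) particle-centre arrays, one per timestep
    r: particle radius; bins: (nx, ny); extent: ((x0, x1), (y0, y1))
    Each particle's solid area is lumped into its host cell (a coarse
    approximation; real codes apportion overlap across cells).
    """
    (x0, x1), (y0, y1) = extent
    nx, ny = bins
    cell_area = ((x1 - x0) / nx) * ((y1 - y0) / ny)
    solid = np.zeros(bins)
    for pts in snapshots:
        counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=bins,
                                      range=[[x0, x1], [y0, y1]])
        solid += counts * np.pi * r ** 2 / cell_area
    return 1.0 - solid / len(snapshots)   # porosity = 1 - solid fraction
```

The resulting porosity (and an analogous velocity) field is what a porous-media SPH fluid model would consume.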


We present finite element simulations of reactive mineral-carrying fluids mixing and mineralization in pore-fluid saturated hydrothermal/sedimentary basins. In particular, we explore the mixing of reactive sulfide and sulfate fluids and the relevant patterns of mineralization for lead, zinc and iron minerals in the regime of temperature-gradient-driven convective flow. Since the mineralization and ore body formation may last quite a long period of time in a hydrothermal basin, it is commonly assumed in geochemistry that the solutions of minerals are in, or near, an equilibrium state. Therefore, the mineralization rate of a particular kind of mineral can be expressed as the product of the pore-fluid velocity and the equilibrium concentration of that mineral. Using this mineralization rate, the potential of the modern mineralization theory is illustrated by means of finite element studies related to reactive mineral-carrying fluids mixing problems in materially homogeneous and inhomogeneous porous rock basins.
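Taking the stated rate law literally, the mineralization rate is the product of the pore-fluid velocity and the equilibrium concentration. A minimal sketch with illustrative values (not from the paper):

```python
import numpy as np

def mineralization_rate(velocity, c_eq):
    """MR = |u| * C_eq, evaluated at each node of a discretised basin."""
    return np.abs(np.asarray(velocity, dtype=float)) * c_eq

u = np.array([1.0e-9, 2.5e-9, 4.0e-9])   # pore-fluid speeds (m/s), illustrative
print(mineralization_rate(u, 0.5))        # per-node rates for C_eq = 0.5 kg/m^3
```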


Carbon monoxide is the chief killer in fires. Dangerous levels of CO can occur when reacting combustion gases are quenched by heat transfer, or by mixing of the fire plume in a cooled under- or overventilated upper layer. In this paper, carbon monoxide predictions for enclosure fires are modeled by the conditional moment closure (CMC) method and are compared with laboratory data. The modeled fire situation is a buoyant, turbulent, diffusion flame burning under a hood. The fire plume entrains fresh air, and the postflame gases are cooled considerably under the hood by conduction and radiation, emulating conditions which occur in enclosure fires and lead to the freezing of CO burnout. Predictions of CO in the cooled layer are presented in the context of a complete computational fluid dynamics solution of velocity, temperature, and major species concentrations. A range of underhood equivalence ratios, from rich to lean, is investigated. The CMC method predicts CO in very good agreement with data. In particular, CMC is able to correctly predict CO concentrations in lean cooled gases, showing its capability in conditions where reaction rates change considerably.
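The rich-to-lean sweep above is characterised by the equivalence ratio, phi = (F/A) / (F/A)_stoich. A minimal sketch (the methane stoichiometric ratio below is a textbook value, not from the paper):

```python
def equivalence_ratio(fuel_air, fuel_air_stoich):
    """phi = (F/A) / (F/A)_stoich: phi > 1 is fuel-rich, phi < 1 is lean."""
    return fuel_air / fuel_air_stoich

# Methane/air example: stoichiometric fuel/air mass ratio is about 1/17.2
phi = equivalence_ratio(0.10, 1 / 17.2)
print(round(phi, 2))  # 1.72, a rich mixture
```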


The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining the appropriate data quickly increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
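The three Halstead metrics named above have standard definitions. A minimal sketch applied to a hypothetical tokenised query (the token split is illustrative, not the study's instrument):

```python
import math

def halstead(operators, operands):
    """Halstead metrics from operator/operand token lists:
    length N = N1 + N2, volume V = N * log2(n1 + n2),
    difficulty D = (n1 / 2) * (N2 / n2), effort E = D * V."""
    n1, n2 = len(set(operators)), len(set(operands))
    N1, N2 = len(operators), len(operands)
    length = N1 + N2
    volume = length * math.log2(n1 + n2)
    difficulty = (n1 / 2) * (N2 / n2)
    return length, difficulty, difficulty * volume

# Hypothetical tokenisation of: SELECT name FROM staff WHERE dept = 'R&D'
ops = ['SELECT', 'FROM', 'WHERE', '=']
opnds = ['name', 'staff', 'dept', "'R&D'"]
print(halstead(ops, opnds))  # (8, 2.0, 48.0)
```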


A critical assessment is presented of the existing fluid flow models used for dense medium cyclones (DMCs) and hydrocyclones. As the present discussion indicates, the understanding of dense medium cyclone flow is still far from complete. However, its similarity to the hydrocyclone provides a basis for improved understanding of fluid flow in DMCs. The complexity of fluid flow in DMCs is basically due to the existence of the medium as well as the dominance of turbulent particle size and density effects on separation. Both theoretical and experimental analyses are discussed with respect to two-phase motion and solid-phase flow in hydrocyclones and DMCs. A detailed discussion is presented on the empirical, semiempirical, and numerical models available in the literature, based upon both the vorticity-stream function approach and the Navier-Stokes equations in their primitive variables and in cylindrical coordinates. The existing equations describing turbulence and multiphase flows in cyclones are also critically reviewed.


Background and purpose Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. Methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
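The two sensitivity tests described (excluding low-scoring populations, and weighting by quality score) can be sketched as follows; all numbers are illustrative, not MONICA data:

```python
def weighted_mean(values, quality):
    """Estimate weighted by per-population quality scores."""
    return sum(v * q for v, q in zip(values, quality)) / sum(quality)

def exclude_low_quality(values, quality, cutoff):
    """Sensitivity test: unweighted mean after dropping populations
    whose quality score falls below the cutoff."""
    kept = [v for v, q in zip(values, quality) if q >= cutoff]
    return sum(kept) / len(kept)

rates = [4.1, 3.6, 5.2, 4.8]    # e.g. a risk-factor rate per population
scores = [0.9, 0.5, 0.8, 0.3]   # illustrative quality scores in [0, 1]
print(round(weighted_mean(rates, scores), 3))           # quality-weighted
print(round(exclude_low_quality(rates, scores, 0.6), 3))  # low-quality excluded
```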


Although managers consider accurate, timely, and relevant information as critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein attempted to investigate and track data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct the errors and prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the action research project. The action research project results indicated that for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary. Also, communication among all stakeholders is required to ensure common understanding of data quality improvement goals. The action research project found that to further substantially improve data quality, structural changes within the organisation and to the information systems are sometimes necessary. The major goal of the action research study is to increase the level of data quality awareness within all organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.


Even when data repositories exhibit near perfect data quality, users may formulate queries that do not correspond to the information requested. Users' poor information retrieval performance may arise either from problems understanding the data models that represent the real-world systems, or from their query skills. This research focuses on users' understanding of the data structures, i.e., their ability to map the information request onto the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users' performance when undertaking component, record, and aggregate level tasks. For the hypotheses associated with different representations but equivalent semantics, the results indicate that participants using the parsimonious data model performed better on component level tasks, but that participants using the ontologically clearer data model performed better on record and aggregate level tasks.


The predictions of nonequilibrium radiation in the shock layer for a Titan aerocapture aeroshell vary significantly amongst Computational Fluid Dynamics (CFD) analyses and are limited by the physical models of the nonequilibrium flow processes. Of particular interest are nonequilibrium processes associated with the CN molecule which is a strong radiator. It is necessary to have experimental data for these radiating shock layers which will allow for validation of the CFD models. This paper describes the development of a test flow condition for subscale aeroshell models in a superorbital expansion tunnel. We discuss the need for a Titan gas condition that closely simulates the atmospheric composition and present experimental data of the free stream test flow conditions. Furthermore, we present finite-rate CFD calculations of the facility to estimate the remaining free stream conditions, which cannot be directly measured during experiments.