22 results for Fluid mechanics - Data processing

at University of Queensland eSpace - Australia


Relevance:

100.00%

Publisher:

Abstract:

This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students’ abilities to tackle real-world problems. A first study illustrates the differences between ideal and real fluid flow force predictions based upon model tests of buildings in a large-size wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides opportunities that cannot be replicated in the classroom, real or virtual. Student feedback demonstrates a strong interest in the project phases of the course. This interest was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, in which physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals.
The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.

Relevance:

100.00%

Publisher:

Abstract:

CFD simulations of the 75 mm hydrocyclone of Hsieh (1988) have been conducted using Fluent™. The simulations used three-dimensional body-fitted grids and were two-phase simulations in which the air core was resolved using the mixture (Manninen et al., 1996) and VOF (Hirt and Nichols, 1981) models. Velocity predictions from large eddy simulations (LES), using the Smagorinsky-Lilly sub-grid-scale model (Smagorinsky, 1963; Lilly, 1966), and RANS simulations, using the differential Reynolds stress turbulence model (Launder et al., 1975), were compared with Hsieh's experimental velocity data. The LES simulations gave very good agreement with Hsieh's data but required very fine grids to predict the velocities correctly in the bottom of the apex. The DRSM/RANS simulations under-predicted tangential velocities, and there was little difference between the velocity predictions using the linear (Launder, 1989) and quadratic (Speziale et al., 1991) pressure strain models. Velocity predictions using the DRSM turbulence model and the linear pressure strain model could be improved by adjusting the pressure strain model constants.
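The Smagorinsky-Lilly model referenced above derives a sub-grid eddy viscosity from the resolved strain rate. A minimal 2-D sketch of that calculation (hypothetical field arrays, a uniform grid spacing, and a typical value of the constant Cs are assumptions, not details of the simulations described):

```python
import numpy as np

def smagorinsky_eddy_viscosity(u, v, dx, cs=0.17):
    """Eddy viscosity nu_t = (Cs*Delta)^2 * |S| on a uniform 2-D grid,
    where |S| = sqrt(2 S_ij S_ij) and S_ij is the resolved strain-rate tensor."""
    dudx = np.gradient(u, dx, axis=0)
    dudy = np.gradient(u, dx, axis=1)
    dvdx = np.gradient(v, dx, axis=0)
    dvdy = np.gradient(v, dx, axis=1)
    s11, s22 = dudx, dvdy                      # normal strain components
    s12 = 0.5 * (dudy + dvdx)                  # shear strain component
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * dx) ** 2 * s_mag              # filter width Delta taken as dx
```

For a uniform linear shear (du/dy = 1) this returns the constant value (Cs*dx)^2, as expected from the formula.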

Relevance:

100.00%

Publisher:

Abstract:

OctVCE is a Cartesian cell CFD code produced especially for numerical simulations of shock and blast wave interactions with complex geometries. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel because it is simple to code and sufficient for practical engineering design problems. This also makes the code much more ‘user-friendly’ than structured-grid approaches, as the gridding process is done automatically. The CFD methodology relies on a finite-volume formulation of the unsteady Euler equations, solved using a standard explicit Godunov (MUSCL) scheme. Both octree-based adaptive mesh refinement and shared-memory parallel processing capability have also been incorporated. For further details on the theory behind the code, see the companion report 2007/12.

Relevance:

100.00%

Publisher:

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining the appropriate data quickly increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
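The three Halstead metrics named in the abstract — length, difficulty, and effort — are computed from counts of distinct and total operators and operands in a query. A sketch assuming a pre-tokenized query (the split of SQL into operator and operand tokens below is an illustrative choice, not the paper's procedure):

```python
import math
from collections import Counter

def halstead_metrics(operators, operands):
    """Halstead length, difficulty, and effort from the token streams of a query.
    n1/n2 = distinct operators/operands; N1/N2 = total occurrences."""
    op, od = Counter(operators), Counter(operands)
    n1, n2 = len(op), len(od)
    N1, N2 = sum(op.values()), sum(od.values())
    length = N1 + N2
    vocabulary = n1 + n2
    volume = length * math.log2(vocabulary) if vocabulary > 1 else 0.0
    difficulty = (n1 / 2) * (N2 / n2) if n2 else 0.0
    effort = difficulty * volume
    return {"length": length, "difficulty": difficulty, "effort": effort}

# Hypothetical tokenization of: SELECT name FROM emp WHERE dept = 'IT'
m = halstead_metrics(["SELECT", "FROM", "WHERE", "="],
                     ["name", "emp", "dept", "'IT'"])
```

Averaging these metrics over a representative query sample, as the study does, then gives the per-instantiation complexity used for comparison.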

Relevance:

100.00%

Publisher:

Abstract:

Computational fluid dynamics was used to search for the links between the observed pattern of attack seen in a bauxite refinery's heat exchanger headers and the hydrodynamics inside the header. The computational fluid dynamics results were validated by comparing them with flow parameters measured in a 1:5 scale model of the first pass header in the laboratory. Computational fluid dynamics simulations were used to establish hydrodynamic similarity between the 1:5 scale and full scale models of the first pass header. It was found that the erosion-corrosion damage seen at the tubesheet of the first pass header was a consequence of increased levels of turbulence at the tubesheet caused by a rapidly turning flow. A prismatic flow correction device introduced in the past helped in rectifying the problem at the tubesheet but exacerbated the erosion-corrosion problem at the first pass header shell. A number of alternative flow correction devices were tested using computational fluid dynamics. Axial ribbing in the first pass header and an inlet flow diffuser showed the best performance and were recommended for implementation. Computational fluid dynamics simulations revealed a smooth, orderly, low-turbulence flow pattern in the second, third and fourth pass headers as well as the exit headers, where no erosion-corrosion was seen in practice. This study has confirmed that near-wall turbulence intensity, which can be successfully predicted using computational fluid dynamics, is a good hydrodynamic predictor of erosion-corrosion damage in complex geometries. (c) 2006 Published by Elsevier Ltd.
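Near-wall turbulence intensity, the hydrodynamic predictor identified above, is conventionally defined from the turbulent kinetic energy k and a reference velocity. A sketch under that standard definition (the function names and the risk threshold are hypothetical, not values from the study):

```python
import numpy as np

def turbulence_intensity(k, u_ref):
    """Turbulence intensity I = u'/U_ref = sqrt(2k/3)/U_ref,
    with k the turbulent kinetic energy per unit mass (m^2/s^2)."""
    return np.sqrt(2.0 * np.asarray(k) / 3.0) / u_ref

def flag_erosion_risk(k_wall, u_ref, threshold=0.2):
    """Flag near-wall cells whose intensity exceeds a (hypothetical) threshold,
    mimicking how high-turbulence regions were correlated with damage."""
    return turbulence_intensity(k_wall, u_ref) > threshold
```

In post-processing, k at wall-adjacent cells would come from the turbulence model's solution field; the threshold would be calibrated against observed damage.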

Relevance:

100.00%

Publisher:

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis.
Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores.
Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
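The two sensitivity tests described — excluding populations below a quality-score cut-off, and weighting each population's result by its score — can be sketched as follows (an illustrative helper, not the project's actual analysis code):

```python
def quality_weighted_mean(values, scores, min_score=None):
    """Quality-score sensitivity analysis: optionally drop populations whose
    score falls below `min_score`, then average values weighted by score."""
    pairs = list(zip(values, scores))
    if min_score is not None:
        pairs = [(v, s) for v, s in pairs if s >= min_score]  # exclusion test
    total = sum(s for _, s in pairs)
    # weighting test: higher-quality populations contribute more
    return sum(v * s for v, s in pairs) / total if total else float("nan")
```

Running the same estimate with and without the cut-off, and with and without the weights, shows how sensitive a survey result is to data quality.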

Relevance:

100.00%

Publisher:

Abstract:

Although managers consider accurate, timely, and relevant information critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein investigated and tracked data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct the errors and prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the project. The results indicated that, for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary. Communication among all stakeholders is also required to ensure a common understanding of data quality improvement goals. The project found that substantial further improvements in data quality sometimes require structural changes within the organisation and to its information systems. The major goal of the study is to increase the level of data quality awareness within organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.

Relevance:

100.00%

Publisher:

Abstract:

Even when data repositories exhibit near perfect data quality, users may formulate queries that do not correspond to the information requested. Users’ poor information retrieval performance may arise either from problems understanding the data models that represent the real-world systems, or from poor query skills. This research focuses on users’ understanding of the data structures, i.e., their ability to map the information request to the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users’ performance when undertaking component-, record-, and aggregate-level tasks. For the hypotheses associated with different representations of equivalent semantics, the results indicate that participants using the parsimonious data model performed better on component-level tasks, whereas participants using the ontologically clearer data model performed better on record- and aggregate-level tasks.

Relevance:

100.00%

Publisher:

Abstract:

OctVCE is a Cartesian cell CFD code produced especially for numerical simulations of shock and blast wave interactions with complex geometries, in particular from explosions. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel for its simplicity and sufficiency for practical engineering design problems. The code uses a finite-volume formulation of the unsteady Euler equations with a second-order explicit Runge-Kutta Godunov (MUSCL) scheme. Gradients are calculated using a least-squares method with a minmod limiter. The flux solvers used are AUSM, AUSMDV and EFM. No fluid-structure coupling or chemical reactions are allowed, but the gas can be modelled as a perfect gas, with the JWL or JWLB models for the explosive products. This report also describes the code’s ‘octree’ mesh adaptive capability and point-inclusion query procedures for the VCE geometry engine. Finally, some space is devoted to describing code parallelization using the shared-memory OpenMP paradigm. The user manual for the code is to be found in the companion report 2007/13.
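The minmod limiter mentioned above clips the reconstruction slope so that the MUSCL scheme stays non-oscillatory near discontinuities. A scalar 1-D sketch (illustrative only; OctVCE itself applies the limiter to least-squares gradients in 3-D):

```python
def minmod(a, b):
    """Minmod slope limiter: 0 when the two candidate slopes disagree in sign,
    otherwise the one of smaller magnitude (keeps the reconstruction TVD)."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def muscl_faces(q_left, q_center, q_right):
    """Second-order MUSCL reconstruction of a cell's two face values from its
    neighbours, using the minmod-limited slope."""
    slope = minmod(q_center - q_left, q_right - q_center)
    return q_center - 0.5 * slope, q_center + 0.5 * slope
```

At a local extremum the two one-sided slopes have opposite signs, so the limiter returns zero and the reconstruction falls back to first order there.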

Relevance:

100.00%

Publisher:

Abstract:

Small mesothermal quartz-gold-base-metal sulfide vein deposits, from which some 20 t of Au-Ag bullion have been extracted, are the most common gold deposits in the Georgetown region of north Queensland; several hundred were mined or prospected between 1870 and 1950. These deposits are mostly hosted by Proterozoic granitic and metamorphic rocks and are similar to the much larger Charters Towers deposits, such as Day Dawn and Brilliant, and in some respects to the Mother Lode deposits of California. The largest deposit in the region, Kidston (>138 t of Au and Ag since 1985), is substantially different. It is hosted by sheeted quartz veins and cavities in brecciated Silurian granite and Proterozoic metamorphics above nested high-level Carboniferous intrusives associated with a nearby cauldron subsidence structure. This paper provides new information (K-Ar and Rb-Sr isotopic ages, preliminary oxygen isotope and fluid-inclusion data) from some of the mesothermal deposits and compares it with the Kidston deposit. All six dated mesothermal deposits have Siluro-Devonian (about 425 to 400 Ma) ages. All nine of the deposits analysed have δ18O quartz values in the range 8.4 to 15.7‰. Fluid-inclusion data indicate homogenisation temperatures in the range 230-350°C. This information, and a re-interpretation of the spatial relationships of the deposits with various elements of the updated regional geology, is used to develop a preliminary metallogenic model of the mesothermal Etheridge Goldfield. The model indicates how the majority of deposits may have formed from hydrothermal systems initiated during the emplacement of granitic batholiths that were possibly, but not clearly, associated with Early Palaeozoic subduction, and that these fluid systems were dominated by substantially modified meteoric and/or magmatic fluids.
The large Kidston deposit and a few small relatives are of Carboniferous age and formed more directly from magmatic systems much closer to the surface.

Relevance:

100.00%

Publisher:

Abstract:

In this work a new approach for designing planar gradient coils is outlined for use in an existing MRI apparatus. A technique that allows for gradient field corrections inside the diameter-sensitive volume is presented. These corrections are brought about by making changes to the wire paths that constitute the coil windings; the approach is hence called the path correction method. The existing, well-known target field method is used to gauge the performance of a typical gradient coil. The gradient coil design methodology is demonstrated for planar openable gradient coils that can be inserted into an existing MRI apparatus. The path corrected gradient coil is compared to the coil obtained using the target field method. It is shown that, using wire path correction with optimized variables, winding patterns can be obtained that deliver high magnetic gradient field strengths and large imaging regions.
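Evaluating the magnetic field produced by a candidate wire path is the basic operation behind both the target field and path correction approaches. A sketch using the standard Biot-Savart law summed over straight segments (the discretization, names, and midpoint-rule approximation are illustrative, not the paper's implementation):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def biot_savart_bz(path, point, current=1.0):
    """z-component of B at `point` due to a polyline wire `path` (N x 3 array),
    summing the Biot-Savart contribution of each straight segment evaluated
    at its midpoint: dB = (mu0 I / 4 pi) * dl x r / |r|^3."""
    path = np.asarray(path, float)
    dl = np.diff(path, axis=0)              # segment direction vectors
    mid = 0.5 * (path[1:] + path[:-1])      # segment midpoints
    r = np.asarray(point, float) - mid      # vectors midpoint -> field point
    r_norm = np.linalg.norm(r, axis=1)
    db = np.cross(dl, r) / r_norm[:, None] ** 3
    return MU0 * current / (4.0 * np.pi) * db[:, 2].sum()
```

As a sanity check, a finely discretized unit circular loop carrying 1 A gives Bz at its center close to the analytical value mu0*I/(2R). In a path-correction scheme, such a field evaluation would sit inside an optimization loop that perturbs the winding path.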