867 results for high performance computing education
Abstract:
Realistic virtual models of leaf surfaces are important for a number of applications in the plant sciences, such as modelling agrichemical spray droplet movement and spreading on the surface. In this context, the virtual surfaces are required to be sufficiently smooth to facilitate the use of the mathematical equations that govern the motion of the droplet. While an effective approach is to apply discrete smoothing D2-spline algorithms to reconstruct the leaf surfaces from three-dimensional scanned data, difficulties arise when dealing with wheat leaves that tend to twist and bend. To overcome this topological difficulty, we develop a parameterisation technique that rotates and translates the original data, allowing the surface to be fitted using the discrete smoothing D2-spline methods in the new parameter space. Our algorithm uses finite element methods to represent the surface as a linear combination of compactly supported shape functions. Numerical results confirm that the parameterisation, along with the use of discrete smoothing D2-spline techniques, produces realistic virtual representations of wheat leaves.
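A minimal sketch of the parameterise-then-fit idea described above, assuming Python with NumPy/SciPy: a PCA-style rotation and translation maps the scanned points into a local parameter space, and SciPy's SmoothBivariateSpline stands in for the paper's discrete smoothing D2-spline / finite element fit. All names and parameter values below are illustrative, not the authors' implementation.

```python
# Illustrative sketch only: PCA supplies the rotation/translation into a
# parameter space, and a SciPy smoothing spline stands in for the discrete
# smoothing D2-spline finite element fit used in the paper.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

def fit_leaf_surface(points, smoothing=1e-3):
    """points: (N, 3) array of scanned leaf coordinates."""
    centred = points - points.mean(axis=0)            # translate to the centroid
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    local = centred @ vt.T                             # rotate so the 3rd axis is 'height'
    u, v, w = local[:, 0], local[:, 1], local[:, 2]
    # Smoothing-spline fit in the new (u, v) parameter space.
    surf = SmoothBivariateSpline(u, v, w, s=smoothing * len(points))
    return surf, vt

# usage (hypothetical data file):
# surf, rotation = fit_leaf_surface(np.loadtxt("leaf_scan.xyz"))
# height_at_origin = surf(0.0, 0.0)
```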
Abstract:
This thesis presents an empirical study of the effects of topology on cellular automata rule spaces. The classical definition of a cellular automaton is restricted to that of a regular lattice, often with periodic boundary conditions. This definition is extended to allow for arbitrary topologies. The dynamics of cellular automata within the triangular tessellation were analysed when transformed to 2-manifolds of topological genus 0, genus 1 and genus 2. Cellular automata dynamics were analysed from a statistical mechanics perspective. The sample sizes required to obtain accurate entropy calculations were determined by an entropy error analysis, which tracked the error in the computed entropy as the sample size increased. Each cellular automata rule space was sampled repeatedly, and the selected cellular automata were simulated over many thousands of trials for each topology. This resulted in an entropy distribution for each rule space. The computed entropy distributions are indicative of the cellular automata dynamical class distribution. Through the comparison of these dynamical class distributions using the E-statistic, it was identified that such topological changes cause these distributions to alter. This is a significant result, which implies that both global structure and local dynamics play an important role in defining the long-term behaviour of cellular automata.
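A small sketch of the entropy estimate and sample-size error analysis described above, assuming Python/NumPy; the triangular-tessellation topologies, rule-space sampling and E-statistic comparison are not reproduced here.

```python
# Shannon entropy of sampled cell states, plus the error-vs-sample-size curve
# used to decide how many trials are needed for an accurate estimate.
import numpy as np

def shannon_entropy(samples):
    """samples: flat array of observed cell states (e.g. 0/1)."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_error_curve(samples, sizes):
    """Absolute error of the entropy estimate at increasing sample sizes."""
    full = shannon_entropy(samples)
    return [(n, abs(shannon_entropy(samples[:n]) - full)) for n in sizes]

# usage (hypothetical CA output):
# states = np.random.randint(0, 2, size=100_000)
# curve = entropy_error_curve(states, sizes=[100, 1_000, 10_000, 100_000])
```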
Abstract:
Firstly, we would like to thank Ms. Alison Brough and her colleagues for their positive commentary on our published work [1] and their appraisal of our use of the “off-set plane” protocol for anthropometric analysis. The standardized protocols described in our manuscript have wide applications, ranging from forensic anthropology and paleodemographic research to clinical settings such as paediatric practice and orthopaedic surgical design. We affirm that the use of geometrically based reference tools commonly found in computer-aided design (CAD) programs such as Geomagic Design X® is imperative for more automated and precise measurement protocols for quantitative skeletal analysis. Therefore, we stand by our recommendation of the use of software such as Amira and Geomagic Design X® in the contexts described in our manuscript...
Abstract:
This paper details the initial design and planning of a Field Programmable Gate Array (FPGA) implemented control system that will enable a path planner to interact with a MAVLink based flight computer. The design is aimed at small Unmanned Aerial Vehicles (UAVs) under autonomous operation, which are typically subject to constraints arising from limited on-board processing capabilities, power and size. An FPGA implementation for the design is chosen for its potential to address such limitations through low power and high speed in-hardware computation. The MAVLink protocol offers a low bandwidth interface for the FPGA implemented path planner to communicate with an on-board flight computer. A control system plan is presented that is capable of accepting a string of GPS waypoints generated on-board from a previously developed in-hardware Genetic Algorithm (GA) path planner and feeding them to the open source PX4 autopilot, while simultaneously responding with flight status information.
Abstract:
Intensity Modulated Radiotherapy (IMRT) is a well-established technique for delivering highly conformal radiation dose distributions. The complexity of the delivery techniques and the high dose gradients around the target volume make verification of the patient treatment crucial to its success. Conventional treatment protocols involve imaging the patient prior to treatment, comparing the patient set-up to the planned set-up, and then making any necessary shifts in the patient position to ensure target volume coverage. This paper presents a method for calibrating electronic portal imaging device (EPID) images acquired during IMRT delivery so that they can be used for verifying the patient set-up.
Abstract:
Skills in spatial sciences are fundamental to understanding our world in context. Increasing digital presence and the availability of data with accurate spatial components have allowed almost everything researchers and students do to be represented in a spatial context. Representing outcomes and disseminating information has moved from 2D to 4D with time-series animation. In the next 5 years, industry will not only demand that QUT graduates have spatial skills along with analytical skills; graduates will also be required to present their findings in spatial visualisations that show spatial, spectral and temporal contexts. Domains such as engineering and science will no longer be the leaders in spatial skills as the social sciences, health, the arts and the business community gain momentum from place-based research, including human interactions. A university that can offer students a pathway to advanced spatial investigation will be ahead of the game.
Abstract:
This poster presents key features of how QUT’s integrated research data storage and management services work with researchers through their own individual or team research life cycle. By understanding the characteristics of research data, and the long-term need to store this data, QUT has provided resources and tools that support QUT’s goal of being a research-intensive institute. Key to successful delivery and operation has been the focus upon researchers’ individual needs and the collaboration between providers, in particular, Information Technology Services, High Performance Computing and Research Support, and QUT Library. QUT’s Research Data Storage service provides all QUT researchers (staff and Higher Degree Research students (HDRs)) with a secure data repository throughout the research data lifecycle. Three distinct storage areas provide for raw research data to be acquired, project data to be worked on, and published data to be archived. Since the service was launched in late 2014, it has provided research project teams from all QUT faculties with acquisition, working or archival data space. Feedback indicates that the storage suits the unique needs of researchers and their data. As part of the workflow to establish storage space for researchers, Research Support Specialists and Research Data Librarians consult with researchers and HDRs to identify data storage requirements for projects and individual researchers, and to select and implement the most suitable data storage services and facilities. While research can be a journey into the unknown [1], a plan can help navigate through the uncertainty. Intertwined with the storage provision is QUT’s Research Data Management Planning tool. Launched in March 2015, it has already attracted 273 QUT staff and 352 HDR student registrations, and over 620 plans have been created (2/10/2015). Developed in collaboration with the Office of Research Ethics and Integrity (OREI), the tool has seen uptake that has exceeded expectations.
Abstract:
A computer code is developed for the numerical prediction of natural convection in rectangular two-dimensional cavities at high Rayleigh numbers. The governing equations are retained in the primitive variable form. The numerical method is based on finite differences and an ADI scheme. Convective terms may be approximated with either central or hybrid differencing for greater stability. A non-uniform grid distribution is possible for greater efficiency. The pressure is dealt with via a SIMPLE type algorithm and the use of a fast elliptic solver for the solenoidal velocity correction field significantly reduces computing times. Preliminary results indicate that the code is reasonably accurate, robust and fast compared with existing benchmarks and finite difference based codes, particularly at high Rayleigh numbers. Extension to three-dimensional problems and turbulence studies in similar geometries is readily possible and indicated.
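The sketch below illustrates the ADI idea behind such a code, assuming Python/NumPy: one Peaceman-Rachford step for a 2-D diffusion term on a uniform grid with fixed boundaries, using a Thomas tridiagonal solve for each line. The actual code's convection terms (central/hybrid differencing), non-uniform grid and SIMPLE-type pressure correction are not reproduced.

```python
# One ADI (Peaceman-Rachford) step for dT/dt = alpha * (T_xx + T_yy).
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub/main/super diagonals a, b, c."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_diffusion_step(T, alpha, dx, dy, dt):
    """One ADI step on field T; boundary values are held fixed (Dirichlet)."""
    ny, nx = T.shape
    rx, ry = alpha * dt / (2 * dx**2), alpha * dt / (2 * dy**2)
    half = T.copy()
    # Sweep 1: implicit in x, explicit in y.
    for j in range(1, ny - 1):
        n = nx - 2
        a = np.full(n, -rx); b = np.full(n, 1 + 2 * rx); c = np.full(n, -rx)
        a[0] = 0.0; c[-1] = 0.0
        d = T[j, 1:-1] + ry * (T[j + 1, 1:-1] - 2 * T[j, 1:-1] + T[j - 1, 1:-1])
        d[0] += rx * T[j, 0]          # known boundary values move to the RHS
        d[-1] += rx * T[j, -1]
        half[j, 1:-1] = thomas(a, b, c, d)
    # Sweep 2: implicit in y, explicit in x.
    new = half.copy()
    for i in range(1, nx - 1):
        n = ny - 2
        a = np.full(n, -ry); b = np.full(n, 1 + 2 * ry); c = np.full(n, -ry)
        a[0] = 0.0; c[-1] = 0.0
        d = half[1:-1, i] + rx * (half[1:-1, i + 1] - 2 * half[1:-1, i] + half[1:-1, i - 1])
        d[0] += ry * half[0, i]
        d[-1] += ry * half[-1, i]
        new[1:-1, i] = thomas(a, b, c, d)
    return new
```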
Abstract:
Numerical predictions are obtained for laminar natural convection of air in a square two-dimensional cavity at high Rayleigh numbers. Proper resolution of the core reveals a weak multi-cellular structure which varies in a complex manner as the effects of convection are increased. The end of the steady laminar regime is numerically estimated to occur at Ra = 2.2 x 10^8.
Abstract:
A computer program has been developed for the prediction of buoyancy-driven laminar and turbulent flow in rectangular air-filled two-dimensional cavities with differentially heated side walls. Laminar flow predictions for a square cavity and Rayleigh numbers from Ra = 10^3 up to the onset of unsteady flow have been obtained. Accurate solutions for Ra = 5 x 10^6, 10^7, 5 x 10^7 and 10^8 are presented and an estimate for the critical Rayleigh number at which the steady laminar flow becomes unsteady is given for this geometry. Numerical predictions of turbulent flow have been obtained for Ra_H ~ O(10^9 - 10^11) and compared with existing experimental data. A previously developed second moment closure model (Behnia et al. 1987) has been used to model the turbulence. Results indicate that a second moment closure model is capable of predicting the observed flow features.
Abstract:
"This chapter discusses laminar and turbulent natural convection in rectangular cavities. Natural convection in rectangular two-dimensional cavities has become a standard problem in numerical heat transfer because of its relevance in understanding a number of problems in engineering. Current research identified a number of difficulties with regard to the numerical methods and the turbulence modeling for this class of flows. Obtaining numerical predictions at high Rayleigh numbers proved computationally expensive such that results beyond Ra ∼ 1014 are rarely reported. The chapter discusses a study in which it was found that turbulent computations in square cavities can't be extended beyond Ra ∼ O (1012) despite having developed a code that proved very efficient for the high Ra laminar regime. As the Rayleigh number increased, thin boundary layers began to form next to the vertical walls, and the central region became progressively more stagnant and highly stratified. Results obtained for the high Ra laminar regime were in good agreement with existing studies. Turbulence computations, although of a preliminary nature, indicated that a second moment closure model was capable of predicting the experimentally observed flow features."--Publisher Summary
Abstract:
We discuss three approaches to the use of technology as a teaching and learning tool that we are currently implementing for a target group of about one hundred second-level engineering mathematics students. Central to these approaches is the underlying theme of motivating relatively poorly motivated students to learn, with the aim of improving learning outcomes. The approaches to be discussed have been used to replace, in part, more traditional mathematics tutorial sessions and lecture presentations. In brief, the first approach involves the application of constructivist thinking in the tertiary education arena, using technology as a computational and visual tool to create motivational knowledge conflicts or crises. The central idea is to model a realistic process of how scientific theory is actually developed, as proposed by Kuhn (1962), in contrast to more standard lecture and tutorial presentations. The second approach involves replacing procedural or algorithmic pencil-and-paper skills-consolidation exercises with software-based tasks. Finally, the third approach aims at creating opportunities for higher-order thinking via "on-line" exploratory or discovery mode tasks. The latter incorporates the incubation period method, as originally discussed by Rubinstein (1975) and others.
Abstract:
Fan-forced injection of phosphine gas fumigant into stored grain is a common method of treating infestation by insects. For low injection velocities the transport of fumigant can be modelled as Darcy flow in a porous medium where the gas pressure satisfies Laplace's equation. Using this approach, a closed-form series solution is derived for the pressure, velocity and streamlines in a cylindrically stored grain bed with either a circular or annular inlet, from which traverse times are numerically computed. A leading-order closed-form expression for the traverse time is also obtained and found to be reasonable for inlet configurations close to the central axis of the grain storage. Results are interpreted for the case of a representative 6 m high farm wheat store, where the time to advect the phosphine to almost the entire grain bed is found to be approximately one hour.
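A rough numerical sketch of the same physical setup, assuming Python/NumPy: the pressure satisfies Laplace's equation in an axisymmetric grain bed, the Darcy velocity is u = -(k/mu) grad p, and an axial traverse time follows by integrating along the centreline. The geometry, inlet pressure, permeability and porosity values are illustrative assumptions, and this finite-difference solution is only a stand-in for the paper's closed-form series solution.

```python
# Axisymmetric Laplace solve for the gas pressure, then Darcy velocity and a
# centreline traverse time. All physical parameters below are assumed values.
import numpy as np

def solve_pressure(nz=120, nr=60, height=6.0, radius=3.0,
                   inlet_radius=0.3, p_inlet=250.0, n_iter=20_000):
    """Jacobi solution of Laplace's equation on a (z, r) grid."""
    dz, dr = height / (nz - 1), radius / (nr - 1)
    r = np.linspace(0.0, radius, nr)
    inlet = r <= inlet_radius
    p = np.zeros((nz, nr))
    for _ in range(n_iter):
        pn = p.copy()
        # interior update of d2p/dz2 + d2p/dr2 + (1/r) dp/dr = 0
        pn[1:-1, 1:-1] = (
            (p[2:, 1:-1] + p[:-2, 1:-1]) / dz**2
            + (p[1:-1, 2:] + p[1:-1, :-2]) / dr**2
            + (p[1:-1, 2:] - p[1:-1, :-2]) / (2 * dr * r[1:-1])
        ) / (2 / dz**2 + 2 / dr**2)
        pn[:, 0] = pn[:, 1]                 # symmetry at the axis
        pn[:, -1] = pn[:, -2]               # sealed outer wall
        pn[0, inlet] = p_inlet              # pressurised circular inlet at the base
        pn[0, ~inlet] = pn[1, ~inlet]       # sealed base elsewhere
        pn[-1, :] = 0.0                     # open grain surface at the top
        p = pn
    return p, dz

def axial_traverse_time(p, dz, permeability=1e-8, viscosity=1.8e-5, porosity=0.4):
    """Time to advect gas from inlet to surface along the central axis."""
    uz = np.abs(-(permeability / viscosity) * np.gradient(p[:, 0], dz))  # Darcy velocity
    return float(np.sum(porosity * dz / uz[uz > 0]))
```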
Abstract:
The phosphine distribution in a cylindrical silo containing grain is predicted. A three-dimensional mathematical model, which accounts for multicomponent gas phase transport and the sorption of phosphine into the grain kernel, is developed. In addition, a simple model is presented to describe the death of insects within the grain as a function of their exposure to phosphine gas. The proposed model is solved using the commercially available computational fluid dynamics (CFD) software, FLUENT, together with our own C code to customize the solver in order to incorporate the models for sorption and insect extinction. Two types of fumigation delivery are studied, namely, fan-forced from the base of the silo and tablet from the top of the silo. An analysis of the predicted phosphine distribution shows that during fan-forced fumigation, the position of the leaky area is very important to the development of the gas flow field and the phosphine distribution in the silo. If the leak is in the lower section of the silo, insects that exist near the top of the silo may not be eradicated. However, the position of a leak does not affect phosphine distribution during tablet fumigation. For such fumigation in a typical silo configuration, phosphine concentrations remain low near the base of the silo. Furthermore, we find that half-life pressure test readings are not an indicator of phosphine distribution during tablet fumigation.
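The paper's insect mortality model is not reproduced here; the sketch below assumes a generic power-law dose-response of the form dN/dt = -k C^n N, with hypothetical constants, purely to illustrate how survival can be accumulated from a phosphine exposure history (Python/NumPy).

```python
# Hedged sketch of a simple dose-response mortality model; the functional form
# and the constants k, n are assumptions, not the values used in the paper.
import numpy as np

def insect_survival(concentration, dt, k=1e-3, n=1.0):
    """Fraction of insects surviving a phosphine exposure history.

    concentration: phosphine concentration (e.g. g/m^3) at each time step
    dt:            time step (h); assumes dN/dt = -k * C**n * N
    """
    exposure = np.sum(k * np.asarray(concentration) ** n * dt)
    return np.exp(-exposure)

# e.g. a constant 0.5 g/m^3 held for 5 days, sampled hourly (hypothetical numbers):
# survival = insect_survival(np.full(120, 0.5), dt=1.0)
```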
Abstract:
In a very recent study [1] the Renormalisation Group (RNG) turbulence model was used to obtain flow predictions in a strongly swirling quarl burner, and was found to perform well in predicting certain features that are not well captured using less sophisticated models of turbulence. The implication is that the RNG approach should provide an economical and reliable tool for the prediction of swirling flows in combustor and furnace geometries commonly encountered in technological applications. To test this hypothesis the present work considers flow in a model furnace for which experimental data are available [2]. The essential features of the flow which differentiate it from the previous study [1] are that the annular air jet entry is relatively narrow and the base wall of the cylindrical furnace is at 90 degrees to the inlet pipe. For swirl numbers of order 1, the resulting flow is highly complex with significant inner and outer recirculation regions. The RNG and standard k-epsilon models are used to model the flow for both swirling and non-swirling entry jets, and the results are compared with experimental data [2]. Near-wall viscous effects are accounted for in both models via the standard wall function formulation [3]. For the RNG model, additional computations with grid placement extending well inside the near-wall viscous-affected sublayer are performed in order to assess the low Reynolds number capabilities of the model.
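For reference, a minimal sketch of a standard log-law wall function of the kind cited as [3], assuming the usual constants kappa = 0.41 and E = 9.0; the exact formulation and constants used in the study may differ (Python).

```python
# Friction velocity from the log law u+ = (1/kappa) * ln(E * y+), solved by
# fixed-point iteration from the tangential velocity at the near-wall node.
import math

def friction_velocity(u_p, y_p, nu, kappa=0.41, E=9.0, iters=50):
    """u_p: velocity at the first grid point, y_p: its wall distance, nu: kinematic viscosity."""
    u_tau = math.sqrt(nu * u_p / y_p)      # laminar (viscous sublayer) initial guess
    for _ in range(iters):
        y_plus = u_tau * y_p / nu
        if y_plus < 11.6:                  # node lies in the viscous sublayer: u+ = y+
            return math.sqrt(nu * u_p / y_p)
        u_tau = kappa * u_p / math.log(E * y_plus)
    return u_tau
```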