941 results for pacs: simulation techniques


Relevance:

30.00%

Publisher:

Abstract:

Parabolic Trough Concentrators (PTC) are the most proven solar collectors for solar thermal power plants, and are also suitable for concentrating photovoltaic (CPV) applications. PV cells are sensitive to the spatial uniformity of incident light and to the cell operating temperature, so CPV-PTC designs must be optimised both optically and thermally. Optical modelling can be performed using Monte Carlo Ray Tracing (MCRT), and conjugate heat transfer (CHT) modelling using computational fluid dynamics (CFD), to analyse the overall designs. This paper develops and evaluates a CHT simulation for a concentrating solar thermal PTC collector. It uses the ray-tracing work of Cheng et al. (2010) and thermal performance data from Dudley et al. (1994) for the LS-2 parabolic trough used in the SEGS III-VII plants. This is a preliminary step towards models that compare the heat transfer performance of faceted absorbers for concentrating photovoltaic (CPV) applications. Reasonable agreement between the simulation results and the experimental data confirms the reliability of the numerical model. The model explores both physical and computational issues for this particular kind of system modelling. The physical issues include the resulting non-uniformity of the boundary heat flux profile and the temperature profile around the tube, and the uneven heating of the heat transfer fluid (HTF). The numerical issues include, most importantly, the design of the computational domain(s) and the solution techniques for the turbulence quantities and the near-wall physics. The simulation confirmed that the optical simulation and the CHT simulation of the collector can be performed independently.
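The non-uniform heat flux around the absorber tube that the abstract discusses can be illustrated with a minimal 2-D Monte Carlo ray trace. This is an illustrative sketch only; the focal length, aperture and tube radius below are hypothetical placeholders, not the LS-2 specification or the Cheng et al. model.

```python
import numpy as np

# Minimal 2-D MCRT sketch of a parabolic trough (hypothetical geometry:
# 1.84 m focal length, 5 m aperture, 35 mm absorber radius, chosen only
# for illustration). Rays arrive vertically on the aperture.
def trace_flux(n_rays=100_000, f=1.84, half_aperture=2.5, r_abs=0.035, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-half_aperture, half_aperture, n_rays)  # mirror hit points
    y = x**2 / (4.0 * f)                                    # parabola y = x^2 / 4f
    # A vertical ray reflected by a parabola passes through the focus (0, f),
    # so it strikes the absorber tube where the mirror-to-focus line enters it.
    ux, uy = -x, f - y
    norm = np.hypot(ux, uy)
    ux, uy = ux / norm, uy / norm
    hit_x = 0.0 - r_abs * ux           # first intersection with the tube surface
    hit_y = f - r_abs * uy
    theta = np.arctan2(hit_y - f, hit_x)  # circumferential angle on the tube
    return theta

theta = trace_flux()
# The flux concentrates on the mirror-facing (lower) half of the tube,
# reproducing the non-uniform boundary heat flux profile discussed above.
frac_lower = np.mean(np.sin(theta) < 0)
```

A histogram of `theta` gives the circumferential flux profile that would be imposed as the boundary condition of the CHT simulation.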

Relevance:

30.00%

Publisher:

Abstract:

A number of mathematical models investigating aspects of the complicated process of wound healing have been reported in the literature in recent years. However, effective numerical methods, and supporting error analysis, for the fractional equations that describe the process of wound healing are still limited. In this paper, we consider the numerical simulation of a fractional mathematical model of epidermal wound healing (FMM-EWH), which is based on coupled advection-diffusion equations for cell and chemical concentration in a polar coordinate system. The space fractional derivatives are defined in the left and right Riemann-Liouville sense, with the fractional orders in the advection and diffusion terms belonging to the intervals (0, 1) and (1, 2], respectively. Several numerical techniques are used. Firstly, the coupled advection-diffusion equations are decoupled into a single space-fractional advection-diffusion equation in a polar coordinate system. Secondly, we propose a new implicit difference method for simulating this equation, using the equivalence of the Riemann-Liouville and Grünwald-Letnikov fractional derivative definitions. Thirdly, its stability and convergence are discussed. Finally, some numerical results are given to demonstrate the theoretical analysis.
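The Grünwald-Letnikov equivalence underlying implicit schemes of this kind can be sketched in a few lines. The example below is a generic shifted Grünwald implicit step for a 1-D left-sided space-fractional diffusion equation, not the paper's coupled polar-coordinate scheme; the grid sizes and coefficient are illustrative.

```python
import numpy as np

# Sketch of an implicit shifted Grünwald-Letnikov step for
# u_t = K d^alpha u / dx^alpha with alpha in (1, 2].
def gl_weights(alpha, n):
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):                     # recurrence g_k = g_{k-1} (1 - (alpha+1)/k)
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def implicit_step(u, alpha=1.8, K=0.5, dt=1e-3, h=1e-2):
    n = u.size
    g = gl_weights(alpha, n + 1)
    A = np.zeros((n, n))
    for i in range(n):                        # shifted GL stencil: row i uses g_0..g_{i+1}
        for k in range(0, i + 2):
            j = i - k + 1
            if 0 <= j < n:
                A[i, j] = g[k]
    M = np.eye(n) - (dt * K / h**alpha) * A   # implicit system matrix
    return np.linalg.solve(M, u)

u0 = np.exp(-((np.linspace(0, 1, 101) - 0.5) ** 2) / 0.005)
u1 = implicit_step(u0)                        # one diffusive step smooths the pulse
```

For `alpha = 2` the weights reduce to the classical second-difference stencil `1, -2, 1`, which is a quick sanity check on the recurrence.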

Relevance:

30.00%

Publisher:

Abstract:

Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing their utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques that enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and of simpler yet equivalent representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry.
Further, a new technique for navigating tessellated or meshed geometries is described, allowing up to 3 orders of magnitude performance improvement through the use of tetrahedral meshes in place of complex triangular surface meshes. The technique applies to the definition of both mechanical parts and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer-aided design, the above optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows direct manipulation of the geometry, enabling, for example, motion augmentation for time-dependent dose calculation. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose-rate meter is embedded in a gel dosimeter, enabling simultaneous 3D dose distribution and dose-rate measurement. This work demonstrates the effectiveness of applying alternative, equivalent geometry definitions to complex geometries to improve Monte Carlo simulation performance. Additionally, these alternative geometry definitions allow manipulations to be performed on otherwise static and rigid geometry.
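The efficiency of tetrahedral-mesh navigation rests on a cheap local primitive: deciding whether a point lies inside a given tetrahedron. A minimal sketch (not the GEANT4 implementation) using barycentric coordinates:

```python
import numpy as np

# A point is inside a tetrahedron iff all four of its barycentric
# coordinates are non-negative. Tracking then proceeds tet-to-tet through
# shared faces instead of searching a whole triangular surface mesh.
def barycentric(p, v0, v1, v2, v3):
    T = np.column_stack((v1 - v0, v2 - v0, v3 - v0))
    l1, l2, l3 = np.linalg.solve(T, p - v0)
    return np.array([1.0 - l1 - l2 - l3, l1, l2, l3])

def inside(p, tet, eps=1e-12):
    return bool(np.all(barycentric(p, *tet) >= -eps))

# Unit tetrahedron for demonstration.
tet = [np.array(v, float) for v in
       [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]]
```

The negative barycentric coordinate also identifies which face the particle exits through, which is what makes stepping through a tetrahedral mesh a constant-time operation per cell.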

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS and MC dose distributions. These implementations are independent of spatial resolution and can interpolate between grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
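The gamma evaluation employed for dose comparison can be sketched in one dimension (the tools above operate on 3-D grids, and the criteria below are the common 3%/3 mm convention, not necessarily the values used in this work): a reference point passes when some evaluated point lies within the combined dose-difference and distance-to-agreement ellipse, i.e. gamma <= 1.

```python
import numpy as np

# 1-D gamma index: for each reference point, take the minimum over all
# evaluated points of the combined dose/distance metric.
def gamma_index(dose_ref, dose_eval, x, dose_crit=0.03, dist_crit=3.0):
    d_norm = dose_ref.max()
    gammas = np.empty(dose_ref.size)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - di) / (dose_crit * d_norm)   # dose-difference term
        dx = (x - xi) / dist_crit                      # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()       # minimum over evaluated points
    return gammas

x = np.linspace(0, 100, 201)                           # position in mm
ref = np.exp(-((x - 50) / 20) ** 2)                    # toy reference profile
shifted = np.exp(-((x - 51) / 20) ** 2)                # 1 mm shift: well within 3%/3 mm
pass_rate = np.mean(gamma_index(ref, shifted, x) <= 1.0)
```

In practice both distributions are interpolated to a common fine grid first, which is what makes the comparison resolution-independent as described above.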

Relevance:

30.00%

Publisher:

Abstract:

Custom designed for display on the Cube Installation in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates the building's energy consumption and generation data into an interactive simulation which is both engaging to users and highly informative, and which invites play and reflection on the role of green buildings. ECOS focuses on the principle that humans can have both a positive and a negative impact on ecosystems, with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that share an interest in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich (2010) and colleagues use the term ‘Eco-feedback technology’ to describe the same field.
‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how they may be achieved. The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modelling. From a mathematical perspective, the simulation can be divided into two models which interact and compete for balance – the comfort of ECOS' virtual denizens, and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort, and provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirement is ascertained, it is balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover it. The relative amount of energy produced by wind and solar can be calculated by considering, in the case of solar for example, the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API. Some of these variables can be altered by the user, allowing them to attempt to optimise the health of the building.
The variables that can be changed are the budgets allocated to green energy sources such as the solar panels and wind generator, and the air conditioning that controls the internal building temperature. These variables influence the energy input and output variables, which are modelled on real energy usage statistics drawn from the SEC data provided by the building managers.
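The balance the simulation computes can be sketched as follows. Every coefficient here (the building heat-loss coefficient, panel efficiency, irradiance) is a hypothetical placeholder for illustration, not a value from the actual ECOS installation.

```python
# Illustrative sketch of the ECOS energy balance: HVAC demand grows with the
# gap between outside temperature and the user-set thermostat, and is offset
# by green generation estimated from live weather conditions.
def hvac_demand_kw(outside_c, thermostat_c, ua_kw_per_c=5.0):
    """Energy needed to hold the set point, proportional to the temperature gap."""
    return ua_kw_per_c * abs(thermostat_c - outside_c)

def solar_output_kw(panel_m2, irradiance_w_m2, efficiency=0.18):
    """Instantaneous solar yield from panel area and current irradiance."""
    return panel_m2 * irradiance_w_m2 * efficiency / 1000.0

def building_balance_kw(outside_c, thermostat_c, panel_m2, irradiance_w_m2):
    # Positive: green generation covers the HVAC load; negative: deficit.
    return (solar_output_kw(panel_m2, irradiance_w_m2)
            - hvac_demand_kw(outside_c, thermostat_c))
```

Adjusting the user-controlled variables (panel budget, thermostat) shifts this balance, which is what drives the health of the virtual world in the simulation.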

Relevance:

30.00%

Publisher:

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
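The Kalman smoothing step at the heart of the conditioning can be sketched for the simplest case, a scalar linear-Gaussian state-space model smoothed by the Rauch-Tung-Striebel recursion. The full emulator additionally models the innovation terms as Gaussian processes of the simulator inputs; that part is omitted here, and all parameter values are illustrative.

```python
import numpy as np

# Forward Kalman filter + backward RTS smoother for the scalar model
# x_t = a x_{t-1} + w_t (var q),  y_t = x_t + v_t (var r).
def rts_smoother(y, a=0.9, q=0.1, r=0.5, x0=0.0, p0=1.0):
    n = len(y)
    xf, pf = np.empty(n), np.empty(n)           # filtered mean / variance
    xm, pm = x0, p0
    for t in range(n):                          # forward pass (filter)
        xp, pp = a * xm, a * a * pm + q         # predict
        k = pp / (pp + r)                       # Kalman gain
        xm, pm = xp + k * (y[t] - xp), (1.0 - k) * pp
        xf[t], pf[t] = xm, pm
    xs = xf.copy()
    for t in range(n - 2, -1, -1):              # backward pass (RTS smoother)
        pp = a * a * pf[t] + q
        g = a * pf[t] / pp
        xs[t] = xf[t] + g * (xs[t + 1] - a * xf[t])
    return xs

y = np.array([1.0, 1.2, 0.8, 1.1, 0.9])        # design "outputs" at time steps
xs = rts_smoother(y)                            # smoothed state trajectory
```

Because the smoother is linear in the number of time steps, it avoids the ill-conditioned covariance matrices that a plain GASP emulator faces when time points are closely spaced.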

Relevance:

30.00%

Publisher:

Abstract:

Plant-based dried food products are popular commodities in the global market, and much research is focused on improving the products and processing techniques. Numerical modelling is highly applicable in this regard, and in this work a coupled meshfree particle-based two-dimensional (2-D) model was developed to simulate micro-scale deformations of plant cells during drying. Smoothed Particle Hydrodynamics (SPH) was used to model the viscous cell protoplasm (cell fluid), approximating it as an incompressible Newtonian fluid. The visco-elastic cell wall was approximated as a neo-Hookean solid augmented with a viscous term and modelled with a Discrete Element Method (DEM). Compared to a previous work [H. C. P. Karunasena, W. Senadeera, Y. T. Gu and R. J. Brown, Appl. Math. Model., 2014], this study proposes three model improvements: a linearly decreasing positive cell turgor pressure during drying, cell wall contraction forces, and cell wall drying. The improvements made the model more comparable with experimental findings on dried cell morphology and geometric properties such as cell area, diameter, perimeter, roundness, elongation and compactness. This single-cell model could be used as a building block for advanced tissue models, which are highly applicable to product and process optimisation in food engineering.
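In SPH models of this kind, field quantities at a particle are kernel-weighted sums over neighbours. The abstract does not state which smoothing kernel was used; the 2-D cubic spline below is the common default and is shown purely as an illustrative assumption.

```python
import numpy as np

# Standard 2-D cubic-spline smoothing kernel W(r, h), compactly
# supported on r < 2h, with 2-D normalisation sigma = 10 / (7 pi h^2).
def cubic_spline_w(r, h):
    sigma = 10.0 / (7.0 * np.pi * h**2)
    q = np.asarray(r, float) / h
    w = np.zeros_like(q)
    m1 = q < 1.0
    m2 = (q >= 1.0) & (q < 2.0)
    w[m1] = 1.0 - 1.5 * q[m1]**2 + 0.75 * q[m1]**3
    w[m2] = 0.25 * (2.0 - q[m2])**3
    return sigma * w

# Density at particle i is then the kernel-weighted sum over neighbours j:
#   rho_i = sum_j m_j W(|r_i - r_j|, h)
h = 1.0
vals = cubic_spline_w(np.array([0.0, 0.5, 1.0, 2.0, 3.0]), h)
```

The compact support is what keeps the neighbour sums local and the method efficient for the large deformations described above.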

Relevance:

30.00%

Publisher:

Abstract:

The objective of this chapter is to provide an overview of the traffic data collection that can and should be used for the calibration and validation of traffic simulation models. There are large differences in the availability of data from different sources. Some types of data, such as loop detector data, are widely available and used. Others can be measured with additional effort, for example travel time data from GPS probe vehicles. Some types, such as trajectory data, are available only in rare situations such as research projects.

Relevance:

30.00%

Publisher:

Abstract:

Using advanced visualization techniques, a comprehensive visualization of all the stages of the self-organized growth of internetworked nanostructures on a plasma-exposed surface has been made. Atomistic kinetic Monte Carlo simulation of the initial stage of deposition, with 3-D visualization of the whole system and half-tone visualization of the density field of the adsorbed atoms, makes it possible to implement multiscale predictive modelling of the development of the nanoscale system.
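The kinetic Monte Carlo machinery behind such a deposition simulation can be sketched generically: at each step an event is chosen with probability proportional to its rate and the clock advances by an exponentially distributed waiting time (the n-fold way). The event names and rates below are arbitrary illustrative values, not those of the plasma system described above.

```python
import random
import math

# Minimal n-fold-way kinetic Monte Carlo loop: pick an event by rate,
# advance time by an exponential waiting step with the total rate.
def kmc_run(rates, n_steps, seed=42):
    rng = random.Random(seed)
    t, counts = 0.0, {name: 0 for name in rates}
    total = sum(rates.values())
    for _ in range(n_steps):
        u = rng.random() * total                 # roulette-wheel event selection
        for name, rate in rates.items():
            u -= rate
            if u <= 0.0:
                counts[name] += 1
                break
        t += -math.log(rng.random()) / total     # exponential time increment
    return t, counts

t, counts = kmc_run({"adsorb": 1.0, "diffuse": 10.0, "desorb": 0.1}, 10_000)
```

In a full atomistic model each lattice site contributes its own environment-dependent rates, but the selection-and-advance loop is the same.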

Relevance:

30.00%

Publisher:

Abstract:

Recently, a variety of high-aspect-ratio nanostructures have been grown and profiled for applications ranging from field emission transistors to gene/drug delivery devices. However, fabricating and processing arrays of these structures, and determining how changing certain physical parameters affects the final outcome, is quite challenging. We have developed several modules that can be used to simulate the processes of various physical vapour deposition systems, from precursor interaction in the gas phase to gas-surface interactions and surface processes. In this paper, multi-scale hybrid numerical simulations are used to study how low-temperature non-equilibrium plasmas can be employed in the processing of high-aspect-ratio structures such that the resulting nanostructures have properties suitable for their eventual device application. We show that whilst using plasma techniques is beneficial in many nanofabrication processes, it is especially useful in making dense arrays of high-aspect-ratio nanostructures.

Relevance:

30.00%

Publisher:

Abstract:

Drying is a key processing technique used in food engineering, and it demands continual development of advanced analysis techniques in order to optimise the product and the process. Plant-based materials are a frequent subject of interest in this regard, as microstructural studies can provide a clearer understanding of the fundamental physical mechanisms involved. In this context, considering the numerous challenges of using conventional grid-based numerical modelling techniques, a meshfree particle-based model was developed to simulate extreme deformations of plant microstructure during drying. The proposed technique couples two particle-based meshfree methods: Smoothed Particle Hydrodynamics (SPH) and a Discrete Element Method (DEM). A tissue model was developed by initialising individual cells as hexagons, modelling each with the coupled SPH-DEM approach, and aggregating them to form a tissue; the model also includes a middle lamella, resembling real tissues. Using the model, different dried tissue states were simulated by varying the moisture content, the turgor pressure, and the cell wall contraction effects. Compared to state-of-the-art grid-based microscale plant tissue drying models, the proposed model is capable of simulating plant tissues at lower moisture contents, which involve excessive shrinkage and cell wall wrinkling. Model predictions were compared with experimental findings, and fairly good agreement was observed both qualitatively and quantitatively.

Relevance:

30.00%

Publisher:

Abstract:

Determining the condition as well as the remaining life of an insulation system is essential for the reliable operation of large oil-filled power transformers. Frequency-domain spectroscopy (FDS) is one of the diagnostic techniques used to identify the dielectric status of a transformer. Currently, this technique can only be implemented on a de-energized transformer. This paper presents an initial investigation into a novel online monitoring method based on FDS dielectric measurements for transformers. The proposed technique specifically aims to address the real operational constraints of online testing. This is achieved by designing an online testing model that extends the basic “extended Debye” linear dielectric model, and by using simulations to take into account noise issues unique to online measurements. Approaches to signal denoising and potential problems expected to be encountered during online measurements are also discussed. Using fixed-frequency sinusoidal excitation waveforms would result in long measurement times, so the use of alternatives such as a chirp has been investigated using simulations. The results presented in the paper predict that reliable measurements should be possible during online testing.
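The appeal of the chirp alternative can be illustrated numerically: a single linear chirp excites a whole band of frequencies at once, whereas fixed-frequency excitation needs one (potentially very long) measurement per frequency point. The sweep parameters below are illustrative, not values from the paper.

```python
import numpy as np

# Generate a linear chirp sweeping f0 -> f1 and confirm via the FFT that
# its energy covers the whole band in a single record.
fs = 1000.0                                    # sample rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)             # 10 s record
f0, f1 = 0.1, 50.0                             # sweep band, Hz
phase = 2.0 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t**2)
chirp = np.sin(phase)                          # instantaneous freq f0 + (f1-f0) t/T

spectrum = np.abs(np.fft.rfft(chirp))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
band = (freqs >= f0) & (freqs <= f1)
in_band_energy = np.sum(spectrum[band] ** 2) / np.sum(spectrum ** 2)
```

Dividing the measured response spectrum by the excitation spectrum then yields the dielectric response across the band from one acquisition, which is the time saving the paper investigates.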

Relevance:

30.00%

Publisher:

Abstract:

This thesis developed a high-performing alternative numerical technique to investigate microscale morphological changes of plant food materials during drying. The technique is based on a novel meshfree method, and is more capable of modelling large deformations of multiphase problem domains than conventional grid-based numerical modelling techniques. The developed cellular model can effectively replicate dried-tissue morphological changes such as shrinkage and cell wall wrinkling, as influenced by moisture reduction and turgor loss.

Relevance:

30.00%

Publisher:

Abstract:

Increased focus on energy cost savings and carbon footprint reduction has improved the visibility of building energy simulation, which has become a mandatory requirement of several building rating systems. Despite developments in building energy simulation algorithms and user interfaces, some major challenges remain; an important one is computational demand and processing time. In this paper, we analyze the opportunities and challenges associated with this topic while executing a set of 275 parametric energy models simultaneously in EnergyPlus on a High Performance Computing (HPC) cluster. Successful parallel computing implementation of building energy simulations will not only reduce the time needed to obtain results and enable scenario development for different design considerations, but might also enable Dynamic-Building Information Modeling (BIM) integration and near real-time decision-making. The paper concludes with a discussion of future directions and opportunities associated with building energy modeling simulations.
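Because each parametric model is independent, farming the 275 runs out to a pool of workers is embarrassingly parallel. A minimal sketch follows; the `energyplus` command line, weather file and model names are hypothetical placeholders, not the actual cluster job script used in the paper.

```python
from concurrent.futures import ProcessPoolExecutor
import subprocess

# Each worker launches one independent EnergyPlus simulation; the pool
# keeps `workers` simulations running at a time until the batch is done.
def run_model(idf_path):
    result = subprocess.run(
        ["energyplus", "-w", "weather.epw", idf_path],  # hypothetical invocation
        capture_output=True,
    )
    return result.returncode

def run_batch(idf_paths, workers=8):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_model, idf_paths))

# Example batch of 275 parametric variants (names are placeholders):
# run_batch([f"model_{i:03d}.idf" for i in range(275)])
```

On a real HPC cluster the same fan-out is usually expressed as an array job in the scheduler, but the structure, one independent simulation per task, is identical.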