921 results for Computational Simulation
Abstract:
Food waste is a current challenge faced by both developing and developed countries. This project applied a novel combination of methods available in mechanical, agricultural and food engineering to address these challenges. A systematic approach was devised to investigate possibilities for reducing food waste and increasing the efficiency of industry by applying engineering concepts and theories, including experimental, mathematical and computational modelling methods. This study highlights how a comprehensive understanding of the response of agricultural and food materials to mechanical operations relates directly to the volume of food wasted globally.
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) that are conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
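The core idea of conditioning a prior on a design data set of model runs can be illustrated with a minimal Gaussian-process interpolator; the toy simulator, squared-exponential kernel and design points below are illustrative assumptions, not the state-space model or hydrological application described in the abstract.

```python
import numpy as np

def simulator(x):
    # Stand-in for an expensive deterministic simulation model (toy example).
    return np.sin(3.0 * x) + 0.5 * x

def rbf_kernel(a, b, length=0.3, var=1.0):
    # Squared-exponential covariance between two sets of scalar inputs.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Design data set: a small number of expensive model runs.
x_design = np.linspace(0.0, 2.0, 8)
y_design = simulator(x_design)

# Condition the zero-mean GP prior on the design runs.
K = rbf_kernel(x_design, x_design) + 1e-10 * np.eye(len(x_design))
alpha = np.linalg.solve(K, y_design)

def emulate(x_new):
    # Posterior mean: cheap interpolation between simulator outputs.
    return rbf_kernel(np.atleast_1d(x_new), x_design) @ alpha

print(emulate(np.array([0.55])), simulator(np.array([0.55])))
```

Once `alpha` is precomputed, each emulator call costs only a kernel evaluation against the design points, which is what makes emulators attractive for sensitivity analysis or inference loops that need thousands of evaluations.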
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation to Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
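The contrast can be sketched with a toy subset simulation run using a modified Metropolis sampler; the one-dimensional limit state, target threshold and tuning values below are illustrative assumptions unrelated to the paper's case studies. With the same N = 2,000 samples, standard Monte Carlo would expect fewer than one sample in the failure region, which is exactly why sMC becomes impractical when the region of interest is small.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Toy limit-state function: "failure" occurs when g(x) > target.
    return x

N, p0, target = 2000, 0.1, 3.5   # true P(X > 3.5) for X ~ N(0,1) is ~2.3e-4

def mm_step(x, thresh, sigma=1.0):
    # Modified Metropolis move targeting the standard normal density,
    # restricted to the conditional level {g(x) > thresh}.
    prop = x + sigma * rng.standard_normal()
    if rng.random() < np.exp(0.5 * (x * x - prop * prop)):
        x = prop if g(prop) > thresh else x
    return x

x = rng.standard_normal(N)       # level 0: plain Monte Carlo samples
p_est = 1.0
for _ in range(12):              # cap on the number of intermediate levels
    y = g(x)
    thresh = np.quantile(y, 1 - p0)
    if thresh >= target:
        break
    p_est *= p0                  # conditional probability of each level
    seeds = x[y > thresh]
    samples = []
    for s in seeds:              # grow each seed chain back to ~N samples
        cur = s
        for _ in range(int(np.ceil(N / len(seeds)))):
            cur = mm_step(cur, thresh)
            samples.append(cur)
    x = np.array(samples[:N])
p_est *= np.mean(g(x) > target)
print(f"subset-simulation estimate: {p_est:.2e}")
```

The structured sampling pays off because each level only has to resolve a conditional probability of about p0 = 0.1, rather than the full rare-event probability at once.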
Abstract:
The wind field of an intense idealised downburst wind storm has been studied using an axisymmetric, dry, non-hydrostatic numerical sub-cloud model. The downburst-driving processes of evaporation and melting have been parameterised by an imposed cooling source that triggers and sustains a downdraft. The simulated downburst exhibits many characteristics of observed full-scale downburst events, in particular the presence of a primary and a counter-rotating secondary ring vortex at the leading edge of the diverging front. The counter-rotating vortex is shown to significantly influence the development and structure of the outflow. Numerical forcing and environmental characteristics have been systematically varied to determine their influence on the outflow wind field. The normalised wind structure at the time of peak outflow intensity was generally shown to remain constant across all simulations. Enveloped velocity profiles considering the velocity structure throughout the entire storm event show much more scatter. Assessing the available kinetic energy within each simulated storm event, it is shown that the simulated downburst wind events had significantly less energy available for loading isolated structures when compared with atmospheric boundary layer winds. The discrepancy is shown to be particularly prevalent when wind speeds were integrated over heights representative of tall buildings. A similar analysis of available full-scale measurements led to similar findings.
Abstract:
A non-translating, long-duration thunderstorm downburst has been simulated experimentally and numerically by modelling a spatially stationary, steady-flow impinging air jet. Velocity profiles were shown to compare well with an upper bound of velocity measurements reported for full-scale microbursts. Velocity speed-up over a range of topographic features in the simulated downburst flow was also tested, with comparisons made to previous work in a similar flow and to boundary layer wind tunnel experiments. It was found that the amplification measured above the crest of topographic features in simulated downburst flow was up to 35% less than that observed in boundary layer flow for all shapes tested. From a computational standpoint, we conclude that the Shear Stress Transport (SST) model performs best among the range of eddy-viscosity and second-moment closures tested for modelling the impinging jet flow.
Abstract:
Scaffolds are porous biocompatible materials with suitable microarchitectures that are designed to allow for cell adhesion, growth and proliferation. They are used in combination with cells in regenerative medicine to promote tissue regeneration by means of a controlled deposition of natural extracellular matrix by the cells hosted within them. This healing process is in many cases accompanied by scaffold degradation, up to its total disappearance when the scaffold is made of a biodegradable material. This work presents a computational model that simulates the degradation of scaffolds. The model works with three-dimensional microstructures, which have been previously discretised into small cubic homogeneous elements, called voxels. The model simulates the evolution of the degradation of the scaffold using a Monte Carlo algorithm, which takes into account the curvature of the surface of the fibres. The simulation results obtained in this study are in good agreement with empirical degradation measurements performed by mass loss on scaffolds after exposure to an etching alkaline solution.
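A minimal voxel-based Monte Carlo degradation loop of the kind described can be sketched as follows; the cubic toy geometry, the use of exposed voxel faces as a crude local-curvature proxy, and the dissolution rate are all illustrative assumptions rather than the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Voxelised scaffold: True = material, False = void (toy 10^3 block).
scaffold = np.zeros((20, 20, 20), dtype=bool)
scaffold[5:15, 5:15, 5:15] = True

def exposed_faces(s):
    # Count void-facing faces per voxel: a crude proxy for local surface
    # curvature (convex regions expose more faces).
    p = np.pad(s, 1)
    faces = np.zeros(s.shape, dtype=int)
    for ax in range(3):
        for d in (1, -1):
            nb = np.roll(p, d, axis=ax)[1:-1, 1:-1, 1:-1]
            faces += ~nb
    return faces

def degrade_step(s, rate=0.1):
    # Monte Carlo step: each surface voxel dissolves with a probability
    # that grows with its number of exposed faces, capped at 1.
    p_dissolve = np.clip(rate * exposed_faces(s), 0.0, 1.0)
    dissolve = s & (rng.random(s.shape) < p_dissolve)
    return s & ~dissolve

mass = [int(scaffold.sum())]
for _ in range(30):
    scaffold = degrade_step(scaffold)
    mass.append(int(scaffold.sum()))
print(mass[0], mass[-1])
```

Tracking `mass` over the iterations gives the simulated mass-loss curve that would be compared against experimental degradation measurements.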
Abstract:
In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, in which we develop simulation-based optimal design methods to search over both continuous and discrete design spaces. Although Bayesian inference has commonly been performed on nonlinear mixed effects models, there is a lack of research into performing Bayesian optimal design for nonlinear mixed effects models that require searches to be performed over several design variables. This is likely due to the fact that it is much more computationally intensive to perform optimal experimental design for nonlinear mixed effects models than it is to perform inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated, subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to be substantially different between the examples considered in this work, which highlights the fact that the designs are rather problem-dependent and require optimisation using the methods presented in this paper.
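The kind of simulation-based design comparison described can be sketched for a one-parameter toy model; the exponential-decay response, uniform prior, noise level, candidate sampling-time designs, and the negative-posterior-variance utility below are all illustrative assumptions, not the pharmacokinetic model or designs from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
k_grid = np.linspace(0.5, 2.0, 201)   # grid over decay rate k; uniform prior
sigma = 0.05                           # known measurement noise (assumed)

def expected_utility(times, n_sim=300):
    # Monte Carlo estimate of a design's expected utility, where the
    # utility of one simulated data set is the negative posterior
    # variance of k (higher utility = more precise estimation).
    total = 0.0
    for _ in range(n_sim):
        k_true = rng.uniform(0.5, 2.0)
        y = np.exp(-k_true * times) + sigma * rng.standard_normal(len(times))
        # Grid posterior under the Gaussian likelihood.
        resid = y[None, :] - np.exp(-k_grid[:, None] * times[None, :])
        loglik = -0.5 * (resid ** 2).sum(axis=1) / sigma ** 2
        w = np.exp(loglik - loglik.max())
        w /= w.sum()
        mean = (w * k_grid).sum()
        total += -((w * (k_grid - mean) ** 2).sum())
    return total / n_sim

early = np.array([0.05, 0.1, 0.15, 0.2])   # all samples taken very early
spread = np.array([0.25, 0.75, 1.5, 3.0])  # samples spread over the decay
u_early = expected_utility(early)
u_spread = expected_utility(spread)
print(u_early, u_spread)
```

Here the spread design wins because the response is barely sensitive to k at very early times; the same simulate-then-score loop, wrapped in a search over the number of subjects, samples per subject and sampling times, is the computationally intensive task the abstract refers to.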
Abstract:
Background Sexually-transmitted pathogens often have severe reproductive health implications if treatment is delayed or absent, especially in females. The complex processes of disease progression, namely replication and ascension of the infection through the genital tract, span both extracellular and intracellular physiological scales, and in females can vary over the distinct phases of the menstrual cycle. The complexity of these processes, coupled with the common impossibility of obtaining comprehensive and sequential clinical data from individual human patients, makes mathematical and computational modelling valuable tools in developing our understanding of the infection, with a view to identifying new interventions. While many within-host models of sexually-transmitted infections (STIs) are available in existing literature, these models are difficult to deploy in clinical/experimental settings since simulations often require complex computational approaches. Results We present STI-GMaS (Sexually-Transmitted Infections – Graphical Modelling and Simulation), an environment for simulation of STI models, with a view to stimulating the uptake of these models within the laboratory or clinic. The software currently focuses upon the representative case-study of Chlamydia trachomatis, the most common sexually-transmitted bacterial pathogen of humans. Here, we demonstrate the use of a hybrid PDE–cellular automata model for simulation of a hypothetical Chlamydia vaccination, demonstrating the effect of a vaccine-induced antibody in preventing the infection from ascending to above the cervix. This example illustrates the ease with which existing models can be adapted to describe new studies, and its careful parameterisation within STI-GMaS facilitates future tuning to experimental data as they arise. 
Conclusions STI-GMaS represents the first software designed explicitly for in-silico simulation of STI models by non-theoreticians, thus presenting a novel route to bridging the gap between computational and clinical/experimental disciplines. With the propensity for model reuse and extension, there is much scope within STI-GMaS to allow clinical and experimental studies to inform model inputs and drive future model development. Many of the modelling paradigms and software design principles deployed to date transfer readily to other STIs, both bacterial and viral; forthcoming releases of STI-GMaS will extend the software to incorporate a more diverse range of infections.
Abstract:
Convectively driven downburst winds pose a threat to structures and communities in many regions of Australia not subject to tropical cyclones. Extreme value analysis shows that for return periods of interest to engineering design these events produce higher gust wind speeds than synoptic scale windstorms. Despite this, comparatively little is known of the near ground wind structure of these potentially hazardous windstorms. With this in mind, a series of idealised three-dimensional numerical simulations were undertaken to investigate convective storm wind fields. A dry, non-hydrostatic, sub-cloud model with parameterisation of the microphysics was used. Simulations were run with a uniform 20 m horizontal grid resolution and a variable vertical resolution increasing from 1 m. A systematic grid resolution study showed further refinement did not alter the morphological structure of the outflow. Simulations were performed for stationary downbursts in a quiescent air field, stationary downbursts embedded within environmental boundary layer winds, and also translating downbursts embedded within environmental boundary layer winds.
Abstract:
A thunderstorm downburst in its simplest form can be modelled as a steady-flow impinging air jet. Although this simplification neglects some important atmospheric and physical parameters, it has proven to be a useful tool for understanding the kinematics of these events. Adopting this simple impinging jet model also allows numerical models to be developed that can be directly compared with experimental results to validate the use of CFD. Confidence gained from these simulations will allow the use of more complex atmospheric impinging jet models that cannot be directly validated. Thunderstorm downbursts are important for wind engineers because, in many parts of the world, they produce the design wind speeds used in design standards, yet their wind structure is not explicitly represented in these documents.
Abstract:
Recently, a variety of high-aspect-ratio nanostructures have been grown and profiled for various applications ranging from field emission transistors to gene/drug delivery devices. However, fabricating and processing arrays of these structures, and determining how changing certain physical parameters affects the final outcome, is quite challenging. We have developed several modules that can be used to simulate the processes of various physical vapour deposition systems, from precursor interaction in the gas phase to gas-surface interactions and surface processes. In this paper, multi-scale hybrid numerical simulations are used to study how low-temperature non-equilibrium plasmas can be employed in the processing of high-aspect-ratio structures such that the resulting nanostructures have properties suitable for their eventual device application. We show that whilst using plasma techniques is beneficial in many nanofabrication processes, it is especially useful in making dense arrays of high-aspect-ratio nanostructures.
Abstract:
This paper addresses an advanced computational technique for steel structures that provides two simulation capabilities simultaneously: a higher-order element formulation with element load effects (geometric nonlinearities) and the refined plastic hinge method (material nonlinearities). This advanced computational technique can capture the real second-order inelastic behaviour of a whole structure, which in turn ensures the structural safety and adequacy of the design. The emphasis of this paper is therefore to advocate that the advanced computational technique can replace the traditional empirical design approach. At the same time, practitioners should be educated in how to apply the advanced computational technique to the second-order inelastic design of a structure, as this approach represents the future of structural engineering design. Future engineers should understand the computational technique clearly, thoroughly appreciate the behaviour of a structure with respect to the numerical analysis, and be able to justify the numerical results correctly, especially since a fool-proof, ultimate finite element, one that is competent in modelling behaviour, user-friendly in numerical modelling and versatile for all structural forms and materials, has yet to emerge. Hence, high-quality engineers are required who can confidently apply the advanced computational technique to the design of a complex structure, rather than the reverse.
Abstract:
The results of numerical simulation of the equilibrium parameters of a low-pressure nanopowder-generating discharge in silane for the plasma-enhanced chemical vapor deposition (PECVD) of nanostructured silicon-based films are presented. It is shown that a low electron temperature and a low density of negative SiH3− ions are favorable for the PECVD process. This opens the possibility of predicting the main parameters of the reactive plasma and plasma-nucleated nanoparticles and, hence, of controlling the quality of silicon nanofilms.
Abstract:
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, enabling more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function that describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, focusing on some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
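One widely used utility in this setting, expected Shannon information gain, can be estimated with the standard nested Monte Carlo estimator and maximised over a discrete design space; the one-parameter logistic model, standard normal prior, and candidate designs below are illustrative assumptions chosen only to show the estimator's shape.

```python
import numpy as np

rng = np.random.default_rng(3)

def eig(x, n_outer=2000, n_inner=1000):
    # Nested Monte Carlo estimate of expected information gain for one
    # Bernoulli observation y ~ Bernoulli(sigmoid(theta * x)), with
    # prior theta ~ N(0, 1) and scalar design variable x.
    theta = rng.standard_normal(n_outer)
    p = 1.0 / (1.0 + np.exp(-theta * x))
    y = rng.random(n_outer) < p
    like = np.where(y, p, 1.0 - p)               # p(y | theta, x)
    theta_in = rng.standard_normal(n_inner)
    p_in = 1.0 / (1.0 + np.exp(-theta_in * x))   # inner prior draws
    # Marginal likelihood p(y | x), averaged over the inner prior draws.
    evidence = np.where(y[None, :], p_in[:, None], 1.0 - p_in[:, None]).mean(axis=0)
    # EIG = E[ log p(y|theta,x) - log p(y|x) ].
    return np.mean(np.log(like) - np.log(evidence))

# Exhaustive search over a small discrete design space.
designs = [0.0, 0.5, 1.0, 2.0, 4.0]
utilities = {x: eig(x) for x in designs}
best = max(utilities, key=utilities.get)
print(utilities, "best:", best)
```

The design x = 0 makes the outcome independent of theta and so carries zero information gain; replacing the exhaustive loop with MCMC, sequential Monte Carlo or other stochastic search is what makes larger, continuous design spaces tractable.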