Abstract:
In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, developing simulation-based optimal design methods that search over both continuous and discrete design spaces. Although Bayesian inference is commonly performed on nonlinear mixed effects models, there is little research into Bayesian optimal design for such models when searches must be performed over several design variables, likely because optimal experimental design for nonlinear mixed effects models is far more computationally intensive than Bayesian inference. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times, for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, differed substantially between the examples considered in this work, which highlights that such designs are problem-dependent and require optimisation using the methods presented in this paper.
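As a hedged illustration of the simulation-based search described above (not the authors' actual model or algorithm), the sketch below scores candidate sets of sampling times by Monte Carlo pseudo-Bayesian D-optimality for a one-parameter exponential-decay response; the prior, residual noise level, and candidate grid are all illustrative assumptions:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
sigma = 0.1                                         # assumed residual s.d.
prior_k = rng.lognormal(np.log(0.5), 0.3, 500)      # prior draws of elimination rate

candidate_times = np.arange(0.5, 12.5, 0.5)         # feasible sampling times (h)
m = 3                                               # samples allowed (cost constraint)

def expected_utility(times, k_draws):
    """Monte Carlo pseudo-Bayesian D-optimality for y = exp(-k t) + eps.

    For a scalar parameter, utility is E_k[log I(k)] with Fisher information
    I(k) = sum_j t_j^2 exp(-2 k t_j) / sigma^2, averaged over prior draws.
    """
    t = np.asarray(times)
    info = (t**2 * np.exp(-2.0 * np.outer(k_draws, t))).sum(axis=1) / sigma**2
    return np.mean(np.log(info))

# Exhaustive search over all m-subsets of the candidate times
best = max(combinations(candidate_times, m),
           key=lambda ts: expected_utility(ts, prior_k))
print(best)
```

In the paper's setting the search would additionally range over the number of subjects and samples per subject under a cost constraint, with the inner utility computed from the full nonlinear mixed effects model rather than this toy response.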
Abstract:
Passenger experience has become a major factor in the success of an airport, and passenger flow simulation is therefore used in designing and managing airports. However, most passenger flow simulations fail to consider group dynamics when developing passenger flow models. In this paper, an agent-based model is presented to simulate passenger behaviour during airport check-in and evacuation processes. The simulation results show that passenger behaviour can significantly influence the performance and utilisation of services in airport terminals. The model was created using AnyLogic software, and its parameters were initialised using recent research data published in the literature.
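As a much-simplified, hypothetical stand-in for the agent-based check-in model (the abstract's model is built in AnyLogic and includes group dynamics, which this sketch omits), a minimal event-driven queue shows how the number of open desks drives passenger waiting time and service utilisation:

```python
import random

random.seed(3)

def simulate_checkin(n_desks, arrival_rate, service_rate, n_pax=2000):
    """Minimal event-driven check-in queue (M/M/c): returns mean wait time.

    Passengers arrive as a Poisson process and are served first-come,
    first-served by the earliest-available desk. All rates are illustrative.
    """
    t = 0.0
    desks = [0.0] * n_desks                      # time each desk becomes free
    waits = []
    for _ in range(n_pax):
        t += random.expovariate(arrival_rate)    # next passenger arrives
        i = min(range(n_desks), key=lambda j: desks[j])
        start = max(t, desks[i])                 # wait if all desks are busy
        desks[i] = start + random.expovariate(service_rate)
        waits.append(start - t)
    return sum(waits) / len(waits)

# Mean wait with 4 desks, arrivals at 1.5/min, service at 0.5/min per desk
print(simulate_checkin(n_desks=4, arrival_rate=1.5, service_rate=0.5))
```

An agent-based model such as the one in the abstract replaces these memoryless rates with individual behaviours (group travel, discretionary activities), which is what lets it capture the effects reported above.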
Abstract:
Introduction: Natural product provenance is important in the food, beverage and pharmaceutical industries, both for consumer confidence and for its health implications. Raman spectroscopy has powerful molecular fingerprinting abilities, and the sharp peaks of Surface Enhanced Raman Spectroscopy (SERS) allow distinction between minimally different molecules, so it should be suitable for this purpose.
Methods: Naturally caffeinated beverages containing Guarana extract, coffee, and Red Bull energy drink as a synthetic caffeinated beverage for comparison (20 µL each) were reacted 1:1 with gold nanoparticles functionalised with anti-caffeine antibody (ab15221) for 10 minutes, air dried and analysed in a micro-Raman instrument. The spectral data were processed using Principal Component Analysis (PCA).
Results: The PCA showed that Guarana-sourced caffeine varied significantly from synthetic caffeine (Red Bull) on component 1 (containing 76.4% of the variance in the data); see Figure 1. The coffee-containing beverages, in particular Robert Timms (instant coffee), were very similar on component 1, though the barista espresso showed minor variance on that component. Both coffee-sourced caffeine samples varied from Red Bull on component 2 (20% of the variance).
[Figure 1: PCA comparing a naturally caffeinated beverage containing Guarana with coffee.]
Discussion: PCA is an unsupervised multivariate statistical method that identifies patterns within data. Figure 1 shows that caffeine in Guarana is notably different from synthetic caffeine. Other researchers have shown that caffeine in Guarana plants is complexed with tannins. In Figure 1, naturally sourced or lightly processed caffeine (Monster Energy, espresso) is more inherently different than synthetic (Red Bull) or highly processed (Robert Timms) caffeine, which is consistent with this finding and demonstrates the technique's applicability.
Guarana provenance is important because Guarana is still largely hand-produced and demand is escalating as its benefits gain recognition. This could be a powerful technique for Guarana provenance, and it may extend to other industries where provenance or authentication is required, e.g. the wine or natural pharmaceuticals industries.
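A minimal sketch of the PCA step described in the Methods, using synthetic stand-in spectra rather than the study's SERS data (the class sizes, spectrum length, and offsets are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Raman spectra: 12 beverage samples x 500 wavenumber bins.
# Two classes ("natural" vs "synthetic" caffeine) differ by a small offset
# on top of a shared baseline spectrum.
base = rng.normal(size=500)
spectra = np.vstack([
    base + 0.5 + 0.1 * rng.normal(size=(6, 500)),   # natural-source samples
    base - 0.5 + 0.1 * rng.normal(size=(6, 500)),   # synthetic-source samples
])

# PCA via SVD of the mean-centred data matrix
centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * s                           # sample scores on each component
explained = s**2 / np.sum(s**2)          # fraction of variance per component

print(explained[0])                      # component 1 dominates, as in the abstract
print(scores[:, 0])                      # component-1 scores separate the two classes
```

Plotting the first two columns of `scores` against each other reproduces the kind of two-component scatter the abstract's Figure 1 describes.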
Abstract:
We examine some variations of standard probability designs that preferentially sample sites based on how easy they are to access. Preferential sampling designs deliver unbiased estimates of the mean and sampling variance and ease the burden of data collection, but at what cost to design efficiency? Preferential sampling has the potential to either increase or decrease sampling variance depending on the application. We carry out a simulation study to gauge its effect when sampling Soil Organic Carbon (SOC) values in a large agricultural region in south-eastern Australia, where preferential sampling can reduce the distance travelled by up to 16%. Our study is based on a dataset of predicted SOC values produced from a datamining exercise. We consider three designs and two ways of determining ease of access. The overall conclusion is that sampling performance deteriorates as the strength of preferential sampling increases, because the regions of high SOC are harder to access, so our designs inadvertently target regions of low SOC value. The good news, however, is that Generalised Random Tessellation Stratification (GRTS) designs are less affected than others, and GRTS remains an efficient design compared to its competitors.
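A hedged sketch of the kind of simulation experiment described above, with a hypothetical region in which high-SOC sites are harder to access; the accessibility weighting, SOC surface, and sample sizes are illustrative assumptions, not the study's actual designs:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_sample = 1000, 50

# Hypothetical region: remote sites are richer in SOC, as in the study area
distance = rng.uniform(0, 100, n_sites)            # travel distance to each site
soc = 10 + 0.15 * distance + rng.normal(0, 2, n_sites)

def inclusion_probs(distance, strength):
    """Inclusion probabilities favouring accessible (nearby) sites.
    strength = 0 reduces to equal-probability sampling."""
    w = np.exp(-strength * distance / distance.max())
    return n_sample * w / w.sum()

for strength in (0.0, 2.0, 5.0):
    p = inclusion_probs(distance, strength)
    idx = rng.choice(n_sites, size=n_sample, replace=False, p=p / p.sum())
    naive = soc[idx].mean()                        # ignores unequal probabilities
    # Hajek-style weighting corrects the mean for unequal inclusion probabilities
    weighted = np.sum(soc[idx] / p[idx]) / np.sum(1.0 / p[idx])
    print(f"strength={strength}: naive={naive:.2f}, weighted={weighted:.2f}")
```

As the preferential strength grows, the unweighted sample mean drifts below the true regional mean because the accessible, low-SOC sites dominate the sample; the probability weighting removes the bias but at the cost of higher variance, which is the efficiency trade-off the study quantifies.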
Abstract:
Background Sexually transmitted pathogens often have severe reproductive health implications if treatment is delayed or absent, especially in females. The complex processes of disease progression, namely replication and ascension of the infection through the genital tract, span both extracellular and intracellular physiological scales, and in females can vary over the distinct phases of the menstrual cycle. The complexity of these processes, coupled with the common impossibility of obtaining comprehensive and sequential clinical data from individual human patients, makes mathematical and computational modelling valuable tools in developing our understanding of the infection, with a view to identifying new interventions. While many within-host models of sexually transmitted infections (STIs) are available in the existing literature, these models are difficult to deploy in clinical and experimental settings, since simulations often require complex computational approaches. Results We present STI-GMaS (Sexually-Transmitted Infections – Graphical Modelling and Simulation), an environment for simulation of STI models, with a view to stimulating the uptake of these models in the laboratory and clinic. The software currently focuses on the representative case study of Chlamydia trachomatis, the most common sexually transmitted bacterial pathogen of humans. Here, we demonstrate the use of a hybrid PDE–cellular automata model to simulate a hypothetical Chlamydia vaccination, showing the effect of a vaccine-induced antibody in preventing the infection from ascending above the cervix. This example illustrates the ease with which existing models can be adapted to describe new studies, and its careful parameterisation within STI-GMaS facilitates future tuning to experimental data as they arise.
Conclusions STI-GMaS represents the first software designed explicitly for in-silico simulation of STI models by non-theoreticians, thus presenting a novel route to bridging the gap between computational and clinical/experimental disciplines. With the propensity for model reuse and extension, there is much scope within STI-GMaS to allow clinical and experimental studies to inform model inputs and drive future model development. Many of the modelling paradigms and software design principles deployed to date transfer readily to other STIs, both bacterial and viral; forthcoming releases of STI-GMaS will extend the software to incorporate a more diverse range of infections.
Abstract:
Dodecylamine was successfully intercalated into the layer space of kaolinite by utilizing the methanol treated kaolinite–dimethyl sulfoxide (DMSO) intercalation complex as an intermediate. The basal spacing of kaolinite, measured by X-ray diffraction (XRD), increased from 0.72 nm to 4.29 nm after the intercalation of dodecylamine. Also, the significant variation observed in the Fourier Transform Infrared Spectroscopy (FTIR) spectra of kaolinite when intercalated with dodecylamine verified the feasibility of intercalation of dodecylamine into kaolinite. Isothermal-isobaric (NPT) molecular dynamics simulation with the use of Dreiding force field was performed to probe into the layering behavior and structure of nanoconfined dodecylamine in the kaolinite gallery. The concentration profiles of the nitrogen atom, methyl group and methylene group of intercalated dodecylamine molecules in the direction perpendicular to the kaolinite basal surface indicated that the alkyl chains within the interlayer space of kaolinite exhibited an obvious layering structure. However, the unified bilayer, pseudo-trilayer, or paraffin-type arrangements of alkyl chains deduced based on their chain length combined with the measured basal spacing of organoclays were not found in this study. The alkyl chains aggregated to a mixture of ordered paraffin-type-like structure and disordered gauche conformation in the middle interlayer space of kaolinite, and some alkyl chains arranged in two bilayer structures, in which one was close to the silica tetrahedron surface, and the other was close to the alumina octahedron surface with their alkyl chains parallel to the kaolinite basal surface.
Abstract:
This paper considers two problems that frequently arise in dynamic discrete choice problems but have received little attention with regard to simulation methods. The first is how to construct unbiased simulators of probabilities conditional on past history. The second is how to simulate a discrete transition probability model when the underlying dependent variable is actually continuous. Both methods work well relative to reasonable alternatives in the application discussed, although in both cases, for this application, simpler methods also provide reasonably good results.
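The second problem can be illustrated with a hedged sketch (not the paper's method): simulate the underlying continuous variable as an AR(1) process, discretise it into bins, and estimate the discrete transition matrix from simulated transition frequencies:

```python
import numpy as np

rng = np.random.default_rng(7)

# Underlying continuous state: AR(1) process x' = rho * x + eps
rho, sigma, n = 0.9, 1.0, 200_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + sigma * rng.standard_normal()

# Discretise into 4 equally populated bins via the sample quartiles
edges = np.quantile(x, [0.25, 0.5, 0.75])
state = np.digitize(x, edges)                 # bin index 0..3 for each period

# Estimate the transition matrix by counting simulated transitions
P = np.zeros((4, 4))
for s, s_next in zip(state[:-1], state[1:]):
    P[s, s_next] += 1
P /= P.sum(axis=1, keepdims=True)             # rows are conditional distributions
print(P.round(2))
```

With a persistent process (rho close to 1), probability mass concentrates near the diagonal of `P`; the paper's point is that the quality of such a discretisation, versus working with the continuous variable directly, is an empirical question for the application at hand.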
Abstract:
In this paper, I present a number of leading examples from the empirical literature that use simulation-based estimation methods. For each example, I describe the model, why simulation is needed, and how to simulate the relevant object. There is a section on simulation methods and another on simulation-based estimation methods. The paper concludes by considering the significance of each of the examples discussed and commenting on potential future areas of interest.
Abstract:
Convectively driven downburst winds pose a threat to structures and communities in many regions of Australia not subject to tropical cyclones. Extreme value analysis shows that, for return periods of interest to engineering design, these events produce higher gust wind speeds than synoptic-scale windstorms. Despite this, comparatively little is known of the near-ground wind structure of these potentially hazardous windstorms. With this in mind, a series of idealised three-dimensional numerical simulations was undertaken to investigate convective storm wind fields. A dry, non-hydrostatic, sub-cloud model with parameterisation of the microphysics was used. Simulations were run with a uniform 20 m horizontal grid resolution and a variable vertical resolution increasing from 1 m. A systematic grid resolution study showed that further refinement did not alter the morphological structure of the outflow. Simulations were performed for stationary downbursts in a quiescent air field, stationary downbursts embedded within environmental boundary layer winds, and translating downbursts embedded within environmental boundary layer winds.
Abstract:
A thunderstorm downburst in its simplest form can be modelled as a steady-flow impinging air jet. Although this simplification neglects some important atmospheric and physical parameters, it has proven to be a useful tool for understanding the kinematics of these events. Adopting this simple impinging jet model also allows numerical models to be developed that can be directly compared with experimental results to validate the use of CFD. Confidence gained from these simulations will allow the use of more complex atmospheric impinging jet models that cannot be directly validated. Thunderstorm downbursts are important to wind engineers because in many parts of the world they produce the design wind speeds used in design standards, yet their structure is not represented in these documents.
Abstract:
The objective of this chapter is to provide an overview of traffic data collection that can and should be used for the calibration and validation of traffic simulation models. There are big differences in availability of data from different sources. Some types of data such as loop detector data are widely available and used. Some can be measured with additional effort, for example, travel time data from GPS probe vehicles. Some types such as trajectory data are available only in rare situations such as research projects.
Abstract:
Arc discharge ablation of a catalyst-filled carbon anode in a helium background was used for the synthesis of graphene and carbon nanotubes. In this paper, we present the results of numerical simulation of the distribution of various plasma parameters in the discharge, as well as the distribution of carbon flux on the nanotube surface, for a typical discharge with an arc current of 60 A and a background gas pressure of 68 kPa.
Abstract:
Using advanced visualization techniques, a comprehensive visualization of all stages of the self-organized growth of internetworked nanostructures on a plasma-exposed surface has been produced. Atomistic kinetic Monte Carlo simulation of the initial stage of deposition, with 3-D visualization of the whole system and half-tone visualization of the density field of the adsorbed atoms, makes it possible to implement multiscale predictive modeling of the development of the nanoscale system.