378 results for discrete-event simulation


Relevance:

30.00%

Publisher:

Abstract:

There are large uncertainties in the aerothermodynamic modelling of super-orbital re-entry, which impact the design of spacecraft thermal protection systems (TPS). Aspects of the thermal environment of super-orbital re-entry flows can be simulated in the laboratory using arc-jet and plasma-jet facilities, and these devices are regularly used for TPS certification work [5]. Another laboratory device capable of simulating certain critical features of both the aerodynamic and thermal environment of super-orbital re-entry is the expansion tube, and three such facilities have been operating at the University of Queensland in recent years [10]. Despite some success, wind tunnel tests do not achieve full simulation. A virtually complete physical simulation of particular re-entry conditions can, however, be obtained from dedicated flight testing; the Apollo-era FIRE II flight experiment [2] is the premier example and still forms an important benchmark for modern simulations. Dedicated super-orbital flight testing is generally considered too expensive today, and there is a reluctance to incorporate substantial instrumentation for aerothermal diagnostics into existing missions, since it may compromise primary mission objectives. An alternative to on-board flight measurements, demonstrated with particular success in the ‘Stardust’ sample return mission, is remote observation of spectral emissions from the capsule and shock layer [8]. JAXA’s ‘Hayabusa’ sample return capsule provides a recent super-orbital re-entry example through which we illustrate contributions in three areas: (1) physical simulation of super-orbital re-entry conditions in the laboratory; (2) computational simulation of such flows; and (3) remote acquisition of optical emissions from a super-orbital re-entry event.

Relevance:

30.00%

Publisher:

Abstract:

Discrete stochastic simulations, via techniques such as the Stochastic Simulation Algorithm (SSA), are a powerful tool for understanding the dynamics of chemical kinetics when certain molecular species are present in low numbers. An important constraint, however, is the assumption of well-mixedness and homogeneity. In this paper, we show how to use Monte Carlo simulations to estimate an anomalous diffusion parameter that encapsulates the crowdedness of the spatial environment. We then use this parameter to replace the rate constants of bimolecular reactions with a time-dependent power law, producing an anomalous SSA (ASSA) that is valid in cases where anomalous diffusion occurs or the system is not well mixed. Simulations then show that ASSA can successfully predict the temporal dynamics of chemical kinetics in a spatially constrained environment.
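To make the idea concrete, the following is a minimal sketch (not the authors' code) of an SSA for a single bimolecular reaction A + B -> C in which the rate constant follows a time-dependent power law k(t) = k0 * t**(alpha - 1), a common fractal-kinetics form. The exponent, parameter values and the simplification of freezing the propensity at the current time when sampling the next reaction are illustrative assumptions.

```python
import numpy as np

def assa_bimolecular(a0=100, b0=80, k0=5e-3, alpha=0.8, t_end=50.0, seed=1):
    rng = np.random.default_rng(seed)
    t, a, b, c = 1e-6, a0, b0, 0          # start slightly above t=0 so t**(alpha-1) stays finite
    history = [(t, a, b, c)]
    while t < t_end and a > 0 and b > 0:
        k_t = k0 * t ** (alpha - 1.0)      # time-dependent power-law rate constant
        prop = k_t * a * b                 # propensity of A + B -> C, evaluated at the current time
        t += rng.exponential(1.0 / prop)   # next-reaction time (approximation for slowly varying k)
        a, b, c = a - 1, b - 1, c + 1      # fire the single reaction channel
        history.append((t, a, b, c))
    return np.array(history)

print(assa_bimolecular()[-1])              # final time and copy numbers
```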

Relevance:

30.00%

Publisher:

Abstract:

Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, in which the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of the Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
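For reference, here is a minimal sketch of the standard Poisson τ-leap update that the higher-order schemes above build on: over a fixed step τ, the number of firings of each reaction channel is drawn as Poisson(a_j(x)·τ). The example network (dimerisation plus decay) and its rate constants are illustrative, not from the paper.

```python
import numpy as np

def poisson_tau_leap(x0, stoich, propensities, tau, n_steps, seed=0):
    """x0: initial copy numbers; stoich: (n_channels, n_species) stoichiometry matrix;
    propensities: function x -> array of channel propensities."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=int)
    for _ in range(n_steps):
        a = propensities(x)
        k = rng.poisson(a * tau)           # firings of each channel in [t, t + tau)
        x = np.maximum(x + k @ stoich, 0)  # crude clamp to keep copy numbers non-negative
    return x

# Example: dimerisation  2A -> B  (rate c1)  and decay  B -> 0  (rate c2)
stoich = np.array([[-2, 1], [0, -1]])
c1, c2 = 0.001, 0.1
props = lambda x: np.array([c1 * x[0] * (x[0] - 1) / 2.0, c2 * x[1]])
print(poisson_tau_leap([1000, 0], stoich, props, tau=0.1, n_steps=200))
```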

Relevance:

30.00%

Publisher:

Abstract:

When compared with other arthroplasties, Total Ankle Joint Replacement (TAR) is much less successful. Attempts to remedy this situation by modifying the implant design, for example by making its form more akin to the original ankle anatomy, have largely met with failure. One of the major obstacles is a gap in current knowledge relating to ankle joint forces: specifically, the lack of reliable data quantifying the forces and moments acting on the ankle in both healthy and diseased joints. The limited data that do exist are thought to be inaccurate [1] and are based upon simplistic, two-dimensional, discrete and outdated techniques.

Relevance:

30.00%

Publisher:

Abstract:

The serviceability and safety of bridges are crucial to people’s daily lives and to the national economy. Every effort should be made to ensure that bridges function safely and properly, as any damage or fault during the service life can lead to transport paralysis, catastrophic loss of property or even casualties. Nonetheless, aggressive environmental conditions, ever-increasing and changing traffic loads, and ageing all contribute to bridge deterioration. With often-constrained budgets, it is important to identify the bridges and bridge elements that should be given higher priority for maintenance, rehabilitation or replacement, and to select the optimal strategy. Bridge health prediction is an essential underpinning science for bridge maintenance optimization, since the effectiveness of an optimal maintenance decision depends largely on the forecasting accuracy of bridge health performance. Current approaches to bridge health prediction can be categorised into two groups: condition-rating based and structural-reliability based. A comprehensive literature review revealed the following limitations of current modelling approaches: (1) no integrated approach to date models both serviceability and safety aspects so that the two performance criteria can be evaluated coherently; (2) complex system modelling approaches have not been successfully applied to bridge deterioration modelling, even though a bridge is a complex system composed of many inter-related elements; (3) multiple deterioration factors, such as deterioration dependencies among different bridge elements, observed information, maintenance actions and environmental effects, have not been considered jointly; (4) existing approaches lack the Bayesian updating ability needed to incorporate a variety of event information; and (5) the assumption of series and/or parallel relationships at the bridge level is made in all structural reliability estimation of bridge systems. To address these deficiencies, this research proposes three novel models based on the Dynamic Object Oriented Bayesian Network (DOOBN) approach. Model I addresses bridge deterioration in serviceability using condition ratings as the health index. Bridge deterioration is represented in a hierarchical relationship, in accordance with the physical structure, so that the contribution of each bridge element to overall deterioration can be tracked. A discrete-time Markov process is employed to model the deterioration of bridge elements over time. Model II addresses bridge deterioration in terms of safety. The structural reliability of bridge systems is estimated from the bridge elements up to the entire bridge. By means of conditional probability tables (CPTs), not only series-parallel relationships but also more complex probabilistic relationships in bridge systems can be modelled effectively. The structural reliability of each bridge element is evaluated from its limit state functions, considering the probability distributions of resistance and applied load. Both Models I and II are designed in three steps: modelling considerations, DOOBN development and parameter estimation. Model III integrates Models I and II to address bridge health performance in both serviceability and safety aspects jointly. The modelling of bridge ratings is modified so that every basic modelling unit denotes one physical bridge element.
According to the specific materials used, the integration of condition ratings and structural reliability is implemented through critical failure modes. Three case studies were conducted to validate the proposed models. Carefully selected data and knowledge from bridge experts, the National Bridge Inventory (NBI) and existing literature were used for model validation. In addition, event information was generated by simulation to demonstrate the Bayesian updating ability of the proposed models. The prediction results for condition ratings and structural reliability are presented and interpreted for basic bridge elements and for the whole bridge system. The results obtained from Model II were compared with those obtained from traditional structural reliability methods. Overall, the prediction results demonstrate the feasibility of the proposed modelling approach for bridge health prediction and support the assertion that the three models, used separately or in an integrated way, are more effective than current bridge deterioration modelling approaches. The primary contribution of this work is to enhance knowledge in the field of bridge health prediction, where health performance is addressed more comprehensively by treating serviceability and safety jointly. The proposed models, characterised by a probabilistic, hierarchical representation of bridge deterioration, demonstrate the effectiveness and promise of the DOOBN approach for bridge health management. Additionally, the proposed models have significant potential for bridge maintenance optimization. Working together with advanced monitoring and inspection techniques and a comprehensive bridge inventory, they can be used by bridge practitioners to achieve increased serviceability and safety as well as maintenance cost-effectiveness.
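As a small illustration of the discrete-time Markov process used in Model I, the sketch below propagates the condition-rating distribution of a single bridge element through a transition matrix. The 4-state rating scale and the transition probabilities are assumed for the example only, not taken from the thesis.

```python
import numpy as np

# P[i, j] = probability of moving from rating i to rating j in one inspection period
# (state 0 = good, state 3 = poor); deterioration only, so P is upper triangular.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

def rating_distribution(p0, years):
    """Propagate an initial rating distribution p0 forward `years` periods."""
    return p0 @ np.linalg.matrix_power(P, years)

p0 = np.array([1.0, 0.0, 0.0, 0.0])         # element starts in the best condition
print(rating_distribution(p0, 20))           # probability of each rating after 20 years
```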

Relevance:

30.00%

Publisher:

Abstract:

Recent management research has demonstrated the significance of organizational social networks, and communication is believed to shape interpersonal relationships. However, we have little knowledge of how communication affects organizational social networks. This paper studies the dynamics between organizational communication patterns and the growth of organizational social networks. We propose an organizational social network growth model and then collect empirical data to test its validity. The simulation results agree well with the empirical data. The results of the simulation experiments enrich our knowledge of communication with the finding that organizational management practices which discourage employees from communicating within and across group boundaries have disparate and significant negative effects on the social network’s density, scalar assortativity and discrete assortativity, each of which correlates with the organization’s performance. These findings also suggest concrete measures by which management can construct and develop the organizational social network.
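The network measures named above can be computed as in the following sketch, which uses networkx on a toy organization graph. Mapping "scalar assortativity" to numeric assortativity over a continuous node attribute (here, tenure) and "discrete assortativity" to categorical assortativity over group membership is an assumption made for this example.

```python
import networkx as nx

G = nx.Graph()
G.add_nodes_from([
    (1, {"group": "A", "tenure": 2}), (2, {"group": "A", "tenure": 5}),
    (3, {"group": "B", "tenure": 1}), (4, {"group": "B", "tenure": 7}),
    (5, {"group": "A", "tenure": 3}),
])
G.add_edges_from([(1, 2), (2, 5), (1, 5), (3, 4), (2, 3)])

print("density:", nx.density(G))
print("scalar (numeric) assortativity:", nx.numeric_assortativity_coefficient(G, "tenure"))
print("discrete (attribute) assortativity:", nx.attribute_assortativity_coefficient(G, "group"))
```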

Relevance:

30.00%

Publisher:

Abstract:

Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation packages. This research compares the capabilities of mainstream car-following models in emulating precise driver-behaviour parameters such as headway and Time to Collision. The comparison first illustrates which model is more robust in reproducing these metrics. Secondly, a series of sensitivity tests was conducted to further explore the behaviour of each model. Based on the outcomes of these two steps, a modified structure and parameter adjustments are proposed for each car-following model to simulate more realistic vehicle movements, particularly headways and Time to Collision below a certain critical threshold. NGSIM vehicle trajectory data are used to evaluate the modified models’ performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical Time to Collision events better than the generic models, while the improvement in headway is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
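The Time to Collision metric referred to above can be computed as in this minimal sketch: the bumper-to-bumper gap divided by the closing speed, defined only when the follower is faster than the leader. The leader length, the example trajectory values and the 3 s critical threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

def time_to_collision(x_lead, x_follow, v_lead, v_follow, lead_length=4.5):
    """Return TTC in seconds, or inf when the vehicles are not closing."""
    gap = x_lead - x_follow - lead_length       # bumper-to-bumper spacing (m)
    closing = v_follow - v_lead                 # closing speed (m/s)
    return gap / closing if closing > 1e-6 else np.inf

# Count critical events along a follower-leader trajectory pair (e.g. extracted from NGSIM)
ttc = [time_to_collision(xl, xf, vl, vf)
       for xl, xf, vl, vf in [(50, 20, 10, 14), (52, 25, 10, 12), (60, 40, 12, 12)]]
critical = sum(t < 3.0 for t in ttc)            # events below an assumed 3 s threshold
print(ttc, critical)
```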

Relevance:

30.00%

Publisher:

Abstract:

User evaluations using paper prototypes commonly lack social context. The group simulation technique described in this paper offers a solution to this problem. The study introduces an early-phase participatory design technique targeted at small groups. The proposed technique is used to evaluate an interface that enables group work in photo collection creation. Three groups of four users, 12 in total, took part in a simulation session in which they tested a low-fidelity design concept that included their own personal photo content from an event their group had attended together. The users’ own content was used to evoke natural experiences. Our results indicate that the technique helped users engage naturally with the prototype during the session. The technique is suggested to be suitable for evaluating other early-phase concepts and for guiding design solutions, especially for concepts that include users’ personal content and enable content sharing.

Relevance:

30.00%

Publisher:

Abstract:

This study was a step forward in the modeling, simulation and microcontroller implementation of a high-performance control algorithm for the motor of a blood pump. The rotor angle is sensed using three Hall-effect sensors, and an algorithm is developed to obtain finer angular resolution from the three signals for better discrete-time updates of the controller. The performance of the system was evaluated in terms of actual and reference speeds, stator currents and power consumption over a range of reference speeds up to 4000 revolutions per minute. The use of a few low-cost Hall-effect sensors instead of expensive high-resolution sensors could reduce the cost of blood pumps for total artificial hearts.
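The kind of angular refinement described above can be illustrated as follows (this is not the paper's algorithm): three Hall sensors give six discrete 60-degree sectors per electrical revolution, and finer resolution can be obtained by interpolating within the current sector using the speed implied by the previous sector time. The Hall-state-to-sector table and the constant-speed interpolation are assumptions for the example.

```python
SECTOR_ANGLE = 60.0  # electrical degrees per Hall sector

# Hall state (h1, h2, h3) -> starting electrical angle of that sector (assumed ordering)
SECTOR_START = {(1, 0, 1): 0.0, (1, 0, 0): 60.0, (1, 1, 0): 120.0,
                (0, 1, 0): 180.0, (0, 1, 1): 240.0, (0, 0, 1): 300.0}

def estimate_angle(hall_state, t_now, t_last_edge, t_prev_edge):
    """Interpolate electrical angle between Hall edges assuming constant speed."""
    sector_period = t_last_edge - t_prev_edge                 # duration of the previous sector (s)
    frac = min((t_now - t_last_edge) / sector_period, 1.0)    # progress through the current sector
    return (SECTOR_START[hall_state] + frac * SECTOR_ANGLE) % 360.0

# e.g. halfway through the (1, 1, 0) sector -> about 150 electrical degrees
print(estimate_angle((1, 1, 0), t_now=0.0105, t_last_edge=0.010, t_prev_edge=0.009))
```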

Relevance:

30.00%

Publisher:

Abstract:

The wind field of an intense idealised downburst wind storm has been studied using an axisymmetric, dry, non-hydrostatic numerical sub-cloud model. The downburst-driving processes of evaporation and melting have been parameterised by an imposed cooling source that triggers and sustains a downdraft. The simulated downburst exhibits many characteristics of observed full-scale downburst events, in particular the presence of a primary and a counter-rotating secondary ring vortex at the leading edge of the diverging front. The counter-rotating vortex is shown to significantly influence the development and structure of the outflow. Numerical forcing and environmental characteristics have been systematically varied to determine their influence on the outflow wind field. The normalised wind structure at the time of peak outflow intensity was generally shown to remain constant across all simulations. Enveloped velocity profiles, which consider the velocity structure throughout the entire storm event, show much more scatter. Assessing the available kinetic energy within each simulated storm event shows that the simulated downburst wind events had significantly less energy available for loading isolated structures than atmospheric boundary layer winds. The discrepancy is particularly pronounced when wind speeds are integrated over heights representative of tall buildings. A similar analysis of available full-scale measurements led to similar findings.
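The height-integrated comparison mentioned above can be sketched as follows: a quantity proportional to kinetic energy (U squared) is integrated over a building height for a downburst-like profile, which decays aloft, and for a power-law boundary-layer profile matched at 10 m. Both profile shapes and all parameter values here are assumptions for illustration only, not the paper's data.

```python
import numpy as np

z = np.linspace(1.0, 200.0, 400)                      # heights over a tall building (m)
u_bl = 30.0 * (z / 10.0) ** 0.14                      # power-law ABL profile, 30 m/s at 10 m
u_db = u_bl * np.exp(-z / 120.0) / np.exp(-10.0 / 120.0)  # downburst-like profile decaying aloft

dz = z[1] - z[0]
e_bl = np.sum(u_bl ** 2) * dz                         # proportional to height-integrated energy
e_db = np.sum(u_db ** 2) * dz
print(f"downburst / boundary-layer integrated U^2 ratio: {e_db / e_bl:.2f}")
```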

Relevance:

30.00%

Publisher:

Abstract:

Convective downburst wind storms generate the peak annual gust wind speed for many parts of the non-cyclonic world at return periods of importance for ultimate limit state design. Despite this, there is little clear understanding of how to appropriately design for these wind events, given their significant dissimilarities to the boundary layer winds upon which most design is based. To enhance the understanding of the wind fields associated with these storms, a three-dimensional numerical model was developed to simulate a multitude of idealised downburst scenarios and to investigate their near-ground wind characteristics. Stationary and translating downdraft wind events in still and sheared environments were simulated, with baseline results showing good agreement with previous numerical work and full-scale observational data. Significant differences are shown in the normalised peak wind speed profiles depending on the environmental wind conditions in the vicinity of the simulated event. When integrated over the height of mid- to high-rise structures, all simulated profiles are shown to produce wind loads smaller than an equivalent open-terrain boundary layer profile matched at a height of 10 m. This suggests that for these structures the current design approach is conservative from an ultimate loading standpoint. Investigating the influence of topography on the structure of the simulated near-ground downburst wind fields shows that such features amplify wind speeds in a manner similar to that expected for boundary layer winds, but the extent of amplification is reduced. The level of reduction is shown to depend on the depth of the simulated downburst outflow.

Relevance:

30.00%

Publisher:

Abstract:

Dose-finding designs estimate the dose level of a drug based on observed adverse events. The relatedness of an adverse event to the drug has generally been ignored in proposed design methodologies. These designs assume that the adverse events observed during a trial are definitely related to the drug, which can lead to flawed dose-level estimation. We incorporate adverse-event relatedness into the continual reassessment method. Adverse events that have a ‘doubtful’ or ‘possible’ relationship to the drug are modelled using a two-parameter logistic model with an additive probability mass. Adverse events ‘probably’ or ‘definitely’ related to the drug are modelled using a cumulative logistic model. To search for the maximum tolerated dose, we use the maximum estimated toxicity probability across these two relatedness categories. We conduct a simulation study that illustrates the characteristics of the design under various scenarios. This article demonstrates that adverse-event relatedness is important for improved dose estimation, and it opens up further research pathways for continual reassessment design methodologies.
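The dose-selection idea can be illustrated with the sketch below: each dose level has an estimated toxicity probability under the two relatedness categories, the category-wise maximum is taken, and the dose whose maximum is closest to the target toxicity rate is recommended, in the spirit of the continual reassessment method. The model forms, parameter values and target rate are placeholders, not the authors' fitted models.

```python
import numpy as np

doses = np.array([1.0, 2.0, 4.0, 8.0])          # standardised dose levels
target = 0.25                                    # assumed target toxicity probability

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed working models for the two relatedness categories
p_doubtful_possible = 0.05 + 0.9 * logistic(-3.0 + 0.4 * doses)   # logistic plus additive mass 0.05
p_probable_definite = logistic(-4.0 + 0.6 * doses)

p_max = np.maximum(p_doubtful_possible, p_probable_definite)       # category-wise maximum
mtd_index = int(np.argmin(np.abs(p_max - target)))                 # dose closest to the target
print(p_max.round(3), "-> recommended dose level:", mtd_index + 1)
```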

Relevance:

30.00%

Publisher:

Abstract:

Plant-based dried food products are popular commodities in the global market, and much research is focused on improving the products and processing techniques. Numerical modelling is highly applicable in this regard, and in this work a coupled, meshfree, particle-based, two-dimensional (2-D) model was developed to simulate micro-scale deformations of plant cells during drying. Smoothed Particle Hydrodynamics (SPH) was used to model the viscous cell protoplasm (cell fluid) by approximating it as an incompressible Newtonian fluid. The visco-elastic character of the cell wall was approximated by a neo-Hookean solid material augmented with a viscous term and modelled with the Discrete Element Method (DEM). Compared with a previous work [H. C. P. Karunasena, W. Senadeera, Y. T. Gu and R. J. Brown, Appl. Math. Model., 2014], this study proposes three model improvements: a linearly decreasing positive cell turgor pressure during drying, cell wall contraction forces and cell wall drying. The improvements make the model more comparable with experimental findings on dried cell morphology and geometric properties such as cell area, diameter, perimeter, roundness, elongation and compactness. This single-cell model could be used as a building block for advanced tissue models, which are highly applicable to product and process optimization in Food Engineering.
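The geometric properties listed above can be computed from a 2-D cell boundary polygon as in the sketch below, using the shoelace formula for area. The exact definitions adopted by the authors (particularly for elongation and compactness) may differ; the forms used here are common choices, and the elliptical example cell is purely illustrative.

```python
import numpy as np

def cell_geometry(xy):
    x, y = xy[:, 0], xy[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))     # shoelace formula
    perim = np.sum(np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0])))  # closed boundary
    diameter = np.sqrt(4.0 * area / np.pi)               # equivalent-circle diameter
    roundness = 4.0 * np.pi * area / perim ** 2           # 1.0 for a perfect circle
    spans = xy.max(axis=0) - xy.min(axis=0)
    elongation = spans.max() / spans.min()                 # bounding-box aspect ratio
    compactness = perim ** 2 / area                        # lower is more compact
    return dict(area=area, perimeter=perim, diameter=diameter,
                roundness=roundness, elongation=elongation, compactness=compactness)

theta = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
cell = np.column_stack([60e-6 * np.cos(theta), 45e-6 * np.sin(theta)])  # a shrunken, elliptical cell
print(cell_geometry(cell))
```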

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, for which we develop simulation-based optimal design methods that search over both continuous and discrete design spaces. Although Bayesian inference has commonly been performed on nonlinear mixed effects models, there is a lack of research into Bayesian optimal design for nonlinear mixed effects models when searches must be performed over several design variables. This is likely because performing optimal experimental design for nonlinear mixed effects models is much more computationally intensive than performing inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times, for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be estimated precisely subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to differ substantially between the examples considered in this work, which highlights that such designs are problem-dependent and require optimisation using the methods presented in this paper.
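The general simulation-based design loop can be sketched as follows: for each candidate design (here, a set of sampling times and a number of subjects), parameters are drawn from a prior, data are simulated, the parameters are re-estimated, and the design is scored by the average estimation error. The one-compartment elimination model, the priors, the quadratic-loss utility and the candidate designs are placeholders for illustration, not the models or designs of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def conc(t, cl, v):
    """First-order elimination after a unit bolus dose (illustrative PK model)."""
    return (1.0 / v) * np.exp(-(cl / v) * t)

def expected_utility(times, n_subjects, n_draws=100):
    losses = []
    for _ in range(n_draws):
        cl, v = rng.lognormal(np.log(0.5), 0.1), rng.lognormal(0.0, 0.1)     # prior draw
        t_all = np.tile(times, n_subjects)                                   # pooled sampling design
        y = conc(t_all, cl, v) * np.exp(rng.normal(0.0, 0.1, t_all.size))    # simulated noisy data
        try:
            est, _ = curve_fit(conc, t_all, y, p0=[0.5, 1.0], maxfev=2000)
            losses.append((est[0] - cl) ** 2 + (est[1] - v) ** 2)
        except (RuntimeError, ValueError):
            losses.append(1e3)                                               # penalise failed fits
    return -np.mean(losses)            # expected utility = negative mean quadratic loss

designs = [np.array([0.5, 1.0, 4.0]), np.array([1.0, 2.0, 8.0])]
scores = [expected_utility(d, n_subjects=8) for d in designs]
print(scores, "-> preferred design:", int(np.argmax(scores)))
```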

Relevance:

30.00%

Publisher:

Abstract:

Precise control of composition and internal structure is essential for a variety of novel technological applications that require highly tailored binary quantum dots (QDs) with predictable optoelectronic and mechanical properties. The delicate balancing act between incoming flux and substrate temperature required for the growth of compositionally graded (Si1-xCx, where x varies throughout the internal structure), core-multishell (discrete shells of Si and C, or combinations thereof) and selected-composition (fixed x) QDs on low-temperature, plasma/ion-flux-exposed Si(100) surfaces is investigated via a hybrid numerical simulation. Incident Si and C ions lead to localized substrate heating and a reduction in the surface diffusion activation energy. It is shown that by incorporating ions in the influx, a steady-state composition is reached more quickly (for selected-composition QDs) and the composition gradient of a Si1-xCx QD may be fine-tuned; additionally, with other deposition conditions remaining the same, larger QDs are obtained on average. It is suggested that ionizing a portion of the influx is another way to control the average size of the QDs and, ultimately, their internal structure. The advantages that can be gained by utilizing plasma/ion-related controls to facilitate the growth of highly tailored, compositionally controlled quantum dots are also discussed.