454 results for Simulation experiments
Abstract:
A simulation-based training system for surgical wound debridement was developed; it comprises a multimedia introduction, a surgical simulator (tutorial component), and an assessment component. The simulator includes two PCs, a haptic device, and a mirrored display. Debridement is performed on a virtual leg model with a shallow laceration wound superimposed. Trainees are instructed to remove debris with forceps, scrub with a brush, and rinse with saline solution to maintain sterility. Research and development issues currently under investigation include tissue deformation models using mass-spring systems and finite element methods; tissue cutting using a high-resolution volumetric mesh and dynamic topology; and accurate collision detection, cutting, and soft-body haptic rendering for two devices within the same haptic space.
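As an illustrative sketch of the mass-spring approach named above (not the authors' implementation; the geometry, stiffness, damping, and time step are assumptions chosen for clarity), a single explicit integration step over a chain of spring-connected nodes might look like this:

```python
import numpy as np

# Minimal mass-spring sketch: three nodes joined by two springs.
# All constants are illustrative assumptions, not values from the paper.
pos = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])   # node positions
vel = np.zeros_like(pos)                               # node velocities
springs = [(0, 1), (1, 2)]                             # node index pairs
rest = np.array([1.0, 1.0])                            # spring rest lengths
k, c, mass, dt = 50.0, 0.5, 1.0, 1e-3                  # stiffness, damping, mass, step

def step(pos, vel):
    force = np.zeros_like(pos)
    for s, (i, j) in enumerate(springs):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - rest[s]) * d / length        # Hooke's law
        force[i] += f
        force[j] -= f
    force -= c * vel                                   # viscous damping
    vel = vel + dt * force / mass                      # semi-implicit Euler
    pos = pos + dt * vel
    return pos, vel

# Displace the end node (a "probe contact") and let the mesh relax.
pos[2] += np.array([0.1, 0.1])
for _ in range(1000):
    pos, vel = step(pos, vel)
```

A real tissue model would use a dense 3D node lattice and couple the integration loop to the haptic device's force feedback at a high update rate.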
A hybrid simulation framework to assess the impact of renewable generators on a distribution network
Abstract:
With an increasing number of small-scale renewable generator installations, distribution network planners face new technical challenges (intermittent load flows, network imbalances…). At the same time, these decentralized generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can both of these aspects be considered when building decision tools for planning future distribution networks? This paper presents a simulation framework which combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation.
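To make the PSO stage concrete, here is a minimal sketch of the canonical velocity and position updates; the placeholder objective, swarm size, and coefficients are assumptions for illustration, since the paper's actual economic cost model is not reproduced here:

```python
import numpy as np

def network_cost(x):
    # Placeholder standing in for the economic evaluation of a DG
    # configuration x (an assumption, not the authors' cost model).
    return np.sum((x - 3.0) ** 2)

rng = np.random.default_rng(0)
n_particles, dim, iters = 20, 5, 100
w, c1, c2 = 0.7, 1.5, 1.5              # inertia, cognitive, social weights

x = rng.uniform(0.0, 10.0, (n_particles, dim))   # candidate configurations
v = np.zeros_like(x)
pbest = x.copy()
pbest_val = np.array([network_cost(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([network_cost(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()
```

In the framework described above, each evaluation of the objective would instead invoke the agent-based simulation over a short time period.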
Abstract:
A 1500-word review of Strindberg: A Life by Sue Prideaux (Yale UP, 2012). One way of classifying biographies is to divide them into those that apply their own interpretative framework – be it psychoanalytic, gender-based, socio-historical, or some other – to a given subject, and those that aim to meet the subject on their own terms, or at least in terms that the subject would recognise...
Abstract:
Brief self-report symptom checklists are often used to screen for postconcussional disorder (PCD) and posttraumatic stress disorder (PTSD) and are highly susceptible to symptom exaggeration. This study examined the utility of the five-item Mild Brain Injury Atypical Symptoms Scale (mBIAS) designed for use with the Neurobehavioral Symptom Inventory (NSI) and the PTSD Checklist–Civilian (PCL–C). Participants were 85 Australian undergraduate students who completed a battery of self-report measures under one of three experimental conditions: control (i.e., honest responding, n = 24), feign PCD (n = 29), and feign PTSD (n = 32). Measures were the mBIAS, NSI, PCL–C, Minnesota Multiphasic Personality Inventory–2, Restructured Form (MMPI–2–RF), and the Structured Inventory of Malingered Symptomatology (SIMS). Participants instructed to feign PTSD and PCD had significantly higher scores on the mBIAS, NSI, PCL–C, and MMPI–2–RF than did controls. Few differences were found between the feign PCD and feign PTSD groups, with the exception of scores on the NSI (feign PCD > feign PTSD) and PCL–C (feign PTSD > feign PCD). Optimal cutoff scores on the mBIAS of ≥8 and ≥6 were found to reflect “probable exaggeration” (sensitivity = .34; specificity = 1.0; positive predictive power, PPP = 1.0; negative predictive power, NPP = .74) and “possible exaggeration” (sensitivity = .72; specificity = .88; PPP = .76; NPP = .85), respectively. Findings provide preliminary support for the use of the mBIAS as a tool to detect symptom exaggeration when administering the NSI and PCL–C.
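For reference, the classification indices reported above have the standard definitions (TP, FP, TN, FN being the true/false positives and negatives at a given cutoff):

```latex
\mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{specificity} = \frac{TN}{TN + FP},\\
\mathrm{PPP} = \frac{TP}{TP + FP}, \qquad
\mathrm{NPP} = \frac{TN}{TN + FN}.
```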
Abstract:
A numerical simulation method for the deformation of Red Blood Cells (RBCs) is presented in this study. The two-dimensional RBC membrane is modeled by a spring network, in which the elastic stretch/compression energy and the bending energy are considered under the constraint of constant RBC surface area. The Smoothed Particle Hydrodynamics (SPH) method is used to solve the Navier-Stokes equations coupled with the plasma–RBC membrane and cytoplasm–RBC membrane interactions. To verify the method, the motion of a single RBC is simulated in Poiseuille flow and compared with previously reported results. Typical motion and deformation mechanisms of the RBC are observed.
Abstract:
The micro-circulation of blood plays an important role in the human body by providing oxygen and nutrients to the cells and removing carbon dioxide and wastes from them. This process is greatly affected by the rheological properties of the Red Blood Cells (RBCs), and changes in these properties are caused by certain human diseases such as malaria and sickle cell disease. It is therefore important to understand the motion and deformation mechanisms of RBCs in order to diagnose and treat such diseases. Although many methods have been developed to explore the behavior of RBCs in micro-channels, they could not explain the deformation mechanism of the RBCs properly. Recently developed particle methods are employed to explain the behavior of RBCs in micro-channels more comprehensively. The main objective of this study is to critically analyze the present methods used to model RBC behavior in micro-channels, in order to develop a computationally efficient particle-based model that describes the complete behavior of RBCs in micro-channels accurately and comprehensively.
Abstract:
The feasibility of using an in-hardware implementation of a genetic algorithm (GA) to solve the computationally expensive travelling salesman problem (TSP) is explored, especially in regard to hardware resource requirements for problem and population sizes. We investigate via numerical experiments whether a small population size might prove sufficient to obtain reasonable-quality solutions for the TSP, thereby permitting a relatively resource-efficient hardware implementation on field programmable gate arrays (FPGAs). Software experiments on two TSP benchmarks involving 48 and 532 cities were used to explore the extent to which population size can be reduced without compromising solution quality, and results show that a GA allowed to run for a large number of generations with a smaller population can yield solutions of comparable quality to those obtained using a larger population. This finding is then used to investigate feasible problem sizes on a targeted Virtex-7 vx485T-2 FPGA platform via exploration of hardware resource requirements for memory and data flow operations.
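As a hedged sketch of the software-experiment setup (a generic permutation-encoded GA with order crossover and swap mutation; the operators, population size, and rates are assumptions, not the paper's exact configuration):

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def order_crossover(p1, p2):
    # OX: copy a slice of p1, fill remaining cities in p2's order.
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child[a:b]]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tsp(dist, pop_size=16, generations=5000, mut_rate=0.2):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        survivors = pop[:pop_size // 2]          # elitist truncation
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = order_crossover(p1, p2)
            if random.random() < mut_rate:       # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda t: tour_length(t, dist))
```

The paper's core trade-off is visible here: a small `pop_size` shrinks the memory footprint (attractive on an FPGA), and `generations` is increased to compensate.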
Abstract:
To fumigate grain stored in a silo, phosphine gas is distributed by a combination of diffusion and fan-forced advection. This initial study of the problem focuses mainly on the advection, numerically modelled as fluid flow in a porous medium. We find satisfactory agreement between the flow predictions of two Computational Fluid Dynamics packages, Comsol and Fluent. The flow predictions demonstrate that the highest velocity (>0.1 m/s) occurs less than 0.2 m from the inlet and reduces drastically over one metre of silo height, with the flow elsewhere less than 0.002 m/s, or 1% of the injection velocity. The flow predictions are examined to identify silo regions where phosphine dosage levels are likely to be too low for effective grain fumigation.
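The advection referred to above is conventionally modelled with a Darcy-type law for flow through the grain bed (the standard porous-medium formulation; the specific closures used in the Comsol and Fluent models are not detailed here):

```latex
\mathbf{u} = -\frac{\kappa}{\mu}\,\nabla p, \qquad \nabla \cdot \mathbf{u} = 0
```

where $\mathbf{u}$ is the superficial gas velocity, $\kappa$ the permeability of the grain bulk, $\mu$ the dynamic viscosity of the gas, and $p$ the pressure.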
Abstract:
A numerical study using large eddy simulation is carried out on the heat and toxic gases released by fires in real road tunnels. Tunnel fire disasters over the previous decade have drawn increasing attention from researchers to the design of safe and reliable ventilation systems. In this research, a real tunnel with a 10 MW fire (approximately the heat release rate of a burning bus) at its midpoint is simulated using FDS (Fire Dynamics Simulator) for different ventilation velocities. Vertical profiles of carbon monoxide concentration and temperature are shown for various locations to explore the flow field. It is found that, as the longitudinal ventilation velocity increases, the vertical profile gradients of both CO concentration and smoke temperature are reduced. However, a relatively large longitudinal ventilation velocity leads to a high similarity between the vertical profile of CO volume concentration and that of the temperature rise.
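For context, the tunnel-ventilation literature commonly relates fire size to the longitudinal velocity needed to prevent smoke backlayering through a critical-velocity correlation of the form (a standard engineering relation, not a result derived in this paper):

```latex
V_c = K \left( \frac{g\, H\, \dot{Q}}{\rho_0\, c_p\, A\, T_f} \right)^{1/3}
```

where $\dot{Q}$ is the heat release rate (10 MW here), $H$ and $A$ the tunnel height and cross-sectional area, $T_f$ the hot-gas temperature, $\rho_0$ and $c_p$ the ambient density and specific heat, and $K$ an empirical constant.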
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications.

For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in recent years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics for ABC via a goodness-of-fit statistic and indirect inference.

Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, but ignoring it can lead to inefficient designs. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty.

Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression because of the body’s natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique based on a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
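As a minimal illustration of the likelihood-free idea described in Part I, the following sketch implements plain rejection ABC (rather than the more efficient SMC variants the thesis develops); the toy exponential model, prior, summary statistic, and tolerance are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: infer the scale of an exponential model from data.
observed = rng.exponential(scale=2.0, size=100)
obs_summary = observed.mean()                     # summary statistic

def simulate(theta, size=100):
    return rng.exponential(scale=theta, size=size)

def abc_rejection(n_samples=1000, tol=0.1):
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(0.1, 10.0)            # draw from the prior
        sim_summary = simulate(theta).mean()
        if abs(sim_summary - obs_summary) < tol:  # compare summaries
            accepted.append(theta)                # keep: data "close enough"
    return np.array(accepted)

posterior_draws = abc_rejection()
print(posterior_draws.mean(), posterior_draws.std())
```

The SMC-ABC algorithms of Part I replace the fixed tolerance with a decreasing sequence and reuse accepted particles, sharply reducing the number of model simulations required.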
Abstract:
NeSSi (network security simulator) is a novel network simulation tool which incorporates a variety of features relevant to network security, distinguishing it from general-purpose network simulators. Its capabilities, such as profile-based automated attack generation, traffic analysis, and support for detection algorithm plug-ins, allow it to be used for security research and evaluation purposes. NeSSi has been successfully used for testing intrusion detection algorithms, conducting network security analysis, and developing overlay security frameworks. NeSSi is built upon the agent framework JIAC, resulting in a distributed and extensible architecture. In this paper, we provide an overview of the NeSSi architecture as well as its distinguishing features and briefly demonstrate its application to current security research projects.
Abstract:
In order to obtain a more compact Superconducting Fault Current Limiter (SFCL), a special geometry of core and AC coil is required. This results in a unique magnetic flux pattern which differs from those associated with conventional round-core arrangements. In this paper the magnetic flux density within a Fault Current Limiter (FCL) is described. Both experimental and analytical approaches are considered. A small-scale, single-phase prototype of an FCL was constructed in order to conduct the experiments. The analysis covers both the steady state and the short-circuit condition. Simulation results were obtained using commercial software based on the Finite Element Method (FEM). The magnetic flux saturating the cores, the leakage magnetic flux giving rise to electromagnetic forces, and the leakage magnetic flux flowing in the enclosing tank are computed.
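The FEM computation referred to above solves, in essence, the nonlinear magnetostatic vector-potential problem (the standard formulation; the commercial package's internals are not described in the paper):

```latex
\nabla \times \left( \frac{1}{\mu(\mathbf{B})}\, \nabla \times \mathbf{A} \right) = \mathbf{J},
\qquad \mathbf{B} = \nabla \times \mathbf{A}
```

where the field-dependent permeability $\mu(\mathbf{B})$ captures core saturation, the mechanism by which the limiter's impedance changes between the steady state and the short-circuit condition.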
Abstract:
A simple experimental apparatus is described in which a wide variety of vapor phase nucleation studies of refractory materials can be performed aboard NASA's KC-135 Research Aircraft. The chief advantage of a microgravity environment for these studies is the expected absence of thermally driven convective motions in the gas. The absence of convection leads to much more accurate knowledge of both the temperature distribution in the system and the time evolution of the refractory vapor concentration as a function of distance from the crucible. The evolution of the apparatus will be described as more experience is gained with the microgravity environment. Such experiments will be used to prepare for similar ones carried out aboard either the shuttle or Space Station, where considerably longer duration experiments are possible.
Abstract:
In this work we discuss the effects of white and coloured noise perturbations on the parameters of a mathematical model of bacteriophage infection introduced by Beretta and Kuang in [Math. Biosc. 149 (1998) 57]. We numerically simulate the strong solutions of the resulting systems of stochastic ordinary differential equations (SDEs) by means of numerical methods of both Euler–Taylor expansion and stochastic Runge–Kutta type, assessing accuracy with respect to the global error.
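As a sketch of the simplest member of the Euler–Taylor family named above, here is Euler–Maruyama applied to a generic scalar SDE (the drift and diffusion below are illustrative stand-ins; the perturbed Beretta–Kuang system is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)

def euler_maruyama(drift, diffusion, x0, t_end, n_steps):
    """Euler-Maruyama scheme (strong order 0.5) for dX = a(X)dt + b(X)dW."""
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dw
    return x

# Logistic growth with multiplicative white noise, an illustrative
# stand-in for one equation of the perturbed infection model.
a = lambda x: 0.5 * x * (1.0 - x / 10.0)
b = lambda x: 0.1 * x
path = euler_maruyama(a, b, x0=1.0, t_end=20.0, n_steps=2000)
```

Global (strong) error is typically estimated empirically by comparing many such paths against a fine-step reference solution driven by the same Brownian increments.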
Abstract:
For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired work flow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts goes into rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only affects the research community but adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behavior to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, whereas other functional modules can be regarded as black boxes. Specific attention is paid to a standardization of data inputs and outputs for traffic simulations. Such standardization will allow the sharing of data with many existing commercial simulation packages.
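To illustrate the kind of pluggable module the framework is designed around, here is a hedged sketch of one widely used car-following model, the Intelligent Driver Model; it is an example of what a contributed module might compute, not code from OpenTraffic itself, and the parameter values are typical textbook choices:

```python
import math

def idm_acceleration(v, v_lead, gap,
                     v0=30.0, T=1.5, a_max=1.0, b=2.0, s0=2.0):
    """Intelligent Driver Model acceleration for the following vehicle.

    v      -- own speed (m/s)
    v_lead -- leader speed (m/s)
    gap    -- bumper-to-bumper distance to the leader (m)
    The remaining parameters (desired speed, time headway, comfortable
    acceleration/braking, jam distance) are typical illustrative values.
    """
    dv = v - v_lead
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)

# One 0.5 s update for a follower 30 m behind a slower leader.
v, gap = 25.0, 30.0
acc = idm_acceleration(v, v_lead=20.0, gap=gap)
v_next = v + acc * 0.5
```

Standardised inputs and outputs, as advocated above, are what allow such a module to be swapped against another car-following model without touching the rest of the simulation.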