378 results for discrete-event simulation


Relevance:

20.00%

Publisher:

Abstract:

A simulation-based training system for surgical wound debridement was developed, comprising a multimedia introduction, a surgical simulator (tutorial component), and an assessment component. The simulator includes two PCs, a haptic device, and a mirrored display. Debridement is performed on a virtual leg model with a shallow laceration wound superimposed. Trainees are instructed to remove debris with forceps, scrub with a brush, and rinse with saline solution to maintain sterility. Research and development issues currently under investigation include tissue deformation models using mass-spring systems and finite element methods; tissue cutting using a high-resolution volumetric mesh and dynamic topology; and accurate collision detection, cutting, and soft-body haptic rendering for two devices within the same haptic space.
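The mass-spring deformation model mentioned above can be illustrated with a minimal sketch. This is not the simulator's implementation; it assumes a toy one-dimensional chain of unit masses joined by Hooke's-law springs, relaxed with damped explicit-Euler integration:

```python
import numpy as np

def spring_forces(positions, rest_length, stiffness):
    """Hooke's-law forces on a 1-D chain of unit masses joined by springs."""
    forces = np.zeros_like(positions)
    for i in range(len(positions) - 1):
        gap = positions[i + 1] - positions[i]
        f = stiffness * (gap - rest_length)  # f > 0 when stretched: ends pulled together
        forces[i] += f
        forces[i + 1] -= f
    return forces

def relax(positions, rest_length, stiffness=50.0, damping=0.9, dt=0.01, steps=2000):
    """Damped explicit-Euler integration until the chain settles."""
    pos = np.array(positions, dtype=float)
    vel = np.zeros_like(pos)
    for _ in range(steps):
        vel = damping * (vel + dt * spring_forces(pos, rest_length, stiffness))
        pos += dt * vel
    return pos

# A compressed/stretched 3-node chain settles to uniform unit spacing.
chain = relax([0.0, 0.5, 2.0], rest_length=1.0)
```

Real tissue models use 2-D/3-D meshes and implicit integrators for stability, but the force assembly and time-stepping loop have this same shape.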

With an increasing number of small-scale renewable generator installations, distribution network planners are faced with new technical challenges (intermittent load flows, network imbalances…). At the same time, these decentralized generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can both of these aspects be considered when building decision tools for planning future distribution networks? This paper presents a simulation framework which combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation.
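A minimal PSO sketch, assuming a toy two-variable cost function in place of the framework's network-cost model (the actual objective and its coupling to the ABM are not shown here):

```python
import random

def pso_minimize(cost, dim, n_particles=20, iters=200, bounds=(-5.0, 5.0), seed=1):
    """Minimal particle swarm optimization: each particle's velocity is
    steered toward its personal best and the swarm's global best."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical stand-in for a network-cost function, with minimum at (1, 2).
best, best_cost = pso_minimize(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2, dim=2)
```

In the framework described above, each candidate DG configuration would be scored by running the ABM over the planning horizon rather than by a closed-form function.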

The Cross-Entropy (CE) method is an efficient technique for estimating rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. CE is used to obtain a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. To evaluate the optimization, a large number of tests were conducted with a real quadcopter.
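The CE iteration — sample candidates from a parameterized distribution, keep an elite fraction, refit the distribution to the elites, repeat — can be sketched as follows. The quadratic cost standing in for the controller's performance metric, and the single scaling factor, are illustrative assumptions, not the paper's setup:

```python
import random
import statistics

def ce_optimize(cost, mu=0.0, sigma=2.0, n_samples=50, n_elite=10, iters=40, seed=0):
    """Cross-Entropy method over one real parameter: sample from a Gaussian,
    keep the lowest-cost elites, refit the Gaussian, repeat."""
    rng = random.Random(seed)
    for _ in range(iters):
        samples = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        elites = sorted(samples, key=cost)[:n_elite]
        mu = statistics.mean(elites)
        sigma = statistics.stdev(elites) + 1e-6  # small floor keeps sampling alive
    return mu

# Hypothetical cost for one input scaling factor, with optimum at 1.5.
gain = ce_optimize(lambda k: (k - 1.5) ** 2)
```

In the work above, evaluating one candidate means running a full ROS-Gazebo flight episode, so each CE iteration is far more expensive than this toy loop suggests.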

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that are able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people. The outliers of the model, those with insufficient likelihood, are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information respectively. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets and we demonstrate improved performance compared to other state-of-the-art methods.
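The likelihood-thresholding idea can be illustrated with an ordinary discrete HMM rather than the Semi-2D variant: the forward algorithm scores an observation sequence, and sequences with insufficient likelihood under the learned 'normal' model are flagged. The two-state model below is a made-up example:

```python
import math

def log_likelihood(obs, start, trans, emit):
    """Forward algorithm: log P(observations | HMM), summing over all
    hidden-state paths."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return math.log(sum(alpha))

# Toy 'normal' model: two sticky hidden states, each preferring one symbol.
start = [0.5, 0.5]
trans = [[0.9, 0.1], [0.1, 0.9]]
emit = [[0.9, 0.1], [0.1, 0.9]]

normal = log_likelihood([0, 0, 0, 1, 1, 1], start, trans, emit)
abnormal = log_likelihood([0, 1, 0, 1, 0, 1], start, trans, emit)
is_abnormal = abnormal < normal  # flag the sequence with lower likelihood
```

The Semi-2D extension described above additionally conditions each state on neighbouring spatial locations; the forward recursion itself is the same building block.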

Brief self-report symptom checklists are often used to screen for postconcussional disorder (PCD) and posttraumatic stress disorder (PTSD) and are highly susceptible to symptom exaggeration. This study examined the utility of the five-item Mild Brain Injury Atypical Symptoms Scale (mBIAS) designed for use with the Neurobehavioral Symptom Inventory (NSI) and the PTSD Checklist–Civilian (PCL–C). Participants were 85 Australian undergraduate students who completed a battery of self-report measures under one of three experimental conditions: control (i.e., honest responding, n = 24), feign PCD (n = 29), and feign PTSD (n = 32). Measures were the mBIAS, NSI, PCL–C, Minnesota Multiphasic Personality Inventory–2, Restructured Form (MMPI–2–RF), and the Structured Inventory of Malingered Symptomatology (SIMS). Participants instructed to feign PTSD and PCD had significantly higher scores on the mBIAS, NSI, PCL–C, and MMPI–2–RF than did controls. Few differences were found between the feign PCD and feign PTSD groups, with the exception of scores on the NSI (feign PCD > feign PTSD) and PCL–C (feign PTSD > feign PCD). Optimal cutoff scores on the mBIAS of ≥8 and ≥6 were found to reflect “probable exaggeration” (sensitivity = .34; specificity = 1.0; positive predictive power, PPP = 1.0; negative predictive power, NPP = .74) and “possible exaggeration” (sensitivity = .72; specificity = .88; PPP = .76; NPP = .85), respectively. Findings provide preliminary support for the use of the mBIAS as a tool to detect symptom exaggeration when administering the NSI and PCL–C.
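The cutoff statistics reported above are all derived from a 2x2 table of feigners versus honest responders against the cutoff score. A sketch of that arithmetic, using hypothetical counts chosen only for illustration (not the study's raw data):

```python
def screening_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and predictive power of a cutoff score,
    from a 2x2 contingency table (positives = feigners at/above cutoff)."""
    return {
        "sensitivity": tp / (tp + fn),  # feigners correctly flagged
        "specificity": tn / (tn + fp),  # honest responders correctly passed
        "ppp": tp / (tp + fp),          # positive predictive power
        "npp": tn / (tn + fn),          # negative predictive power
    }

# Illustrative counts only: 61 feigners (44 flagged) and 24 controls (3 flagged).
m = screening_metrics(tp=44, fn=17, fp=3, tn=21)
```

Note that PPP and NPP depend on the base rate of exaggeration in the sample, so the study's reported values would not be reproduced by arbitrary counts like these.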

A numerical simulation method for Red Blood Cell (RBC) deformation is presented in this study. The two-dimensional RBC membrane is modeled by a spring network, in which the elastic stretch/compression energy and the bending energy are considered under the constraint of constant RBC surface area. The Smoothed Particle Hydrodynamics (SPH) method is used to solve the Navier-Stokes equations coupled with the plasma-RBC membrane and cytoplasm-RBC membrane interactions. To verify the method, the motion of a single RBC is simulated in Poiseuille flow and compared with results reported earlier. Typical motion and deformation mechanisms of the RBC are observed.
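The stretch and bending energies of such a spring-network membrane can be sketched for a closed 2-D polygon of nodes. The energy forms and coefficients below are illustrative assumptions, not the study's exact formulation:

```python
import math

def membrane_energy(nodes, rest_length, k_stretch=1.0, k_bend=0.5):
    """Stretch/compression plus bending energy of a closed 2-D spring
    network approximating an RBC membrane cross-section."""
    n = len(nodes)
    e_stretch = e_bend = 0.0
    for i in range(n):
        (x0, y0), (x1, y1) = nodes[i], nodes[(i + 1) % n]
        length = math.hypot(x1 - x0, y1 - y0)
        e_stretch += 0.5 * k_stretch * (length - rest_length) ** 2
        # Bending: penalise the turning angle between adjacent segments.
        (x2, y2) = nodes[(i + 2) % n]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
        e_bend += 0.5 * k_bend * turn ** 2
    return e_stretch + e_bend

# A regular hexagon with unit edges: zero stretch energy, bending only.
hexagon = [(math.cos(a), math.sin(a)) for a in [i * math.pi / 3 for i in range(6)]]
energy = membrane_energy(hexagon, rest_length=1.0)
```

In a full simulation the negative gradient of this energy supplies the membrane forces that enter the SPH fluid solver, with an extra penalty term enforcing the constant-area constraint.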

The micro-circulation of blood plays an important role in the human body by providing oxygen and nutrients to the cells and removing carbon dioxide and wastes from them. This process is greatly affected by the rheological properties of Red Blood Cells (RBCs), which are altered by certain human diseases such as malaria and sickle cell disease. It is therefore important to understand the motion and deformation mechanisms of RBCs in order to diagnose and treat such diseases. Although many methods have been developed to explore the behavior of RBCs in micro-channels, they have not fully explained the RBCs' deformation mechanism. Recently developed particle methods model the behavior of RBCs in micro-channels more comprehensively. The main objective of this study is to critically analyze the present methods used to model RBC behavior in micro-channels, in order to develop a computationally efficient particle-based model that describes the complete behavior of RBCs in micro-channels accurately and comprehensively.

To fumigate grain stored in a silo, phosphine gas is distributed by a combination of diffusion and fan-forced advection. This initial study of the problem focuses mainly on the advection, numerically modelled as fluid flow in a porous medium. We find satisfactory agreement between the flow predictions of two Computational Fluid Dynamics packages, Comsol and Fluent. The flow predictions demonstrate that the highest velocity (>0.1 m/s) occurs less than 0.2 m from the inlet and decreases drastically over one metre of silo height, with the flow elsewhere less than 0.002 m/s, or 1% of the injection velocity. The flow predictions are examined to identify silo regions where phosphine dosage levels are likely to be too low for effective grain fumigation.
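For the porous-medium advection model, the bulk velocity scale can be estimated from Darcy's law. The permeability and pressure drop below are assumed, order-of-magnitude values for a bulk grain bed, not parameters from the study:

```python
def darcy_velocity(permeability, viscosity, pressure_drop, length):
    """Superficial (Darcy) velocity of gas driven through a porous bed
    by a pressure difference: u = (k / mu) * (dP / L)."""
    return permeability / viscosity * pressure_drop / length

k = 2.0e-9   # m^2, assumed permeability of a bulk wheat bed
mu = 1.8e-5  # Pa*s, dynamic viscosity of air at room temperature
u = darcy_velocity(k, mu, pressure_drop=100.0, length=5.0)  # assumed 100 Pa over 5 m
```

With these assumed inputs the superficial velocity comes out near 0.002 m/s, the same order as the far-field flow quoted above, though the agreement depends entirely on the assumed permeability.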

A numerical study using large eddy simulation is carried out on the heat and toxic gases released by fires in real road tunnels. Owing to the tunnel-fire disasters of the previous decade, creating safe and reliable ventilation designs has attracted increasing attention from researchers. In this research, a real tunnel with a 10 MW fire (approximately the heat release rate of a burning bus) at its mid-point is simulated using FDS (Fire Dynamics Simulator) for different ventilation velocities. Vertical profiles of carbon monoxide concentration and temperature are shown at various locations to explore the flow field. It is found that the vertical profile gradients of CO concentration and smoke temperature both decrease as the longitudinal ventilation velocity increases. However, a relatively large longitudinal ventilation velocity leads to a high similarity between the vertical profile of CO volume concentration and that of the temperature rise.

NeSSi (network security simulator) is a novel network simulation tool which incorporates a variety of features relevant to network security, distinguishing it from general-purpose network simulators. Its capabilities, such as profile-based automated attack generation, traffic analysis and support for detection algorithm plug-ins, allow it to be used for security research and evaluation purposes. NeSSi has been successfully used for testing intrusion detection algorithms, conducting network security analysis and developing overlay security frameworks. NeSSi is built upon the agent framework JIAC, resulting in a distributed and extensible architecture. In this paper, we provide an overview of the NeSSi architecture as well as its distinguishing features and briefly demonstrate its application to current security research projects.

High resolution transmission electron microscopy of the Mighei carbonaceous chondrite matrix has revealed the presence of a new mixed layer structure material. This mixed-layer material consists of an ordered arrangement of serpentine-type (S) and brucite-type (B) layers in the sequence ... SBBSBB. ... Electron diffraction and imaging techniques show that the basal periodicity is ~ 17 Å. Discrete crystals of SBB-type material are typically curved, of small size (<1 μm) and show structural variations similar to the serpentine group minerals. Mixed-layer material also occurs in association with planar serpentine. Characteristics of SBB-type material are not consistent with known terrestrial mixed-layer clay minerals. Evidence for formation by a condensation event or by subsequent alteration of preexisting material is not yet apparent. © 1982.

Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.

Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that are contradictory to this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for the development thereof as a flexible, visual, collaborative, scalable and open system.

Process mining encompasses the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered highly valuable for the development and assessment of both process discovery and conformance techniques.
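A toy conformance check, assuming a simplified direct-follows model rather than ProM's actual token-replay metrics; fitness here is just the fraction of log traces the model can replay end to end, and all activity names are made up:

```python
def fitness(log, model_edges, start, ends):
    """Fraction of traces the model can replay: each trace must begin at
    `start`, finish in `ends`, and use only allowed transitions."""
    def replayable(trace):
        if not trace or trace[0] != start or trace[-1] not in ends:
            return False
        return all((a, b) in model_edges for a, b in zip(trace, trace[1:]))
    return sum(replayable(t) for t in log) / len(log)

# Hypothetical model: register -> check -> (approve | reject)
edges = {("register", "check"), ("check", "approve"), ("check", "reject")}
log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "approve"],            # skips the mandatory check
    ["register", "check", "approve"],
]
score = fitness(log, edges, start="register", ends={"approve", "reject"})
```

Real conformance metrics such as token-based fitness give partial credit for partially replayable traces and are complemented by precision and generalization measures, which is exactly the metric suite such a framework would compute consistently.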