378 results for discrete-event simulation


Relevance: 20.00%

Abstract:

In this work we discuss the effects of white and coloured noise perturbations on the parameters of a mathematical model of bacteriophage infection introduced by Beretta and Kuang in [Math. Biosci. 149 (1998) 57]. We numerically simulate the strong solutions of the resulting systems of stochastic ordinary differential equations (SDEs), measuring accuracy in terms of the global error, by means of numerical methods of both Euler–Taylor expansion and stochastic Runge–Kutta type.
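As a minimal sketch of the Euler–Taylor family mentioned above, the following illustrates its simplest member, the Euler–Maruyama scheme, on a scalar SDE with multiplicative white noise. The drift, noise intensity and parameters are purely illustrative, not those of the Beretta–Kuang phage model:

```python
import math
import random

def euler_maruyama(x0, drift, diffusion, t_end, n_steps, seed=0):
    """Strong Euler-Maruyama approximation of dX = a(X) dt + b(X) dW."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + drift(x) * dt + diffusion(x) * dw
    return x

# Illustrative example: logistic growth with a white-noise perturbation
a = lambda x: 0.5 * x * (1 - x / 10.0)  # deterministic drift (assumed)
b = lambda x: 0.1 * x                   # noise intensity (assumed)
x_final = euler_maruyama(1.0, a, b, t_end=10.0, n_steps=1000)
```

Higher-order Euler–Taylor and stochastic Runge–Kutta methods refine this one-step update to reduce the global error as the step size shrinks.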

Relevance: 20.00%

Abstract:

For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired workflow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts goes into rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only hampers the research community but adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behavior to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, while other functional modules can be regarded as black boxes. Specific attention is paid to the standardization of data inputs and outputs for traffic simulations. Such standardization will allow the sharing of data with many existing commercial simulation packages.
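The modular, black-box design described above might be sketched as an abstract module interface with interchangeable implementations. The class names and interface here are hypothetical illustrations, not the actual OpenTraffic API; the car-following model shown is the well-known Intelligent Driver Model (IDM) with illustrative parameter defaults:

```python
from abc import ABC, abstractmethod

class CarFollowingModel(ABC):
    """Hypothetical plug-in interface: any car-following module can be
    swapped in while the rest of the framework treats it as a black box."""
    @abstractmethod
    def acceleration(self, gap, speed, leader_speed):
        """Return the follower's acceleration (m/s^2)."""

class IDM(CarFollowingModel):
    """Intelligent Driver Model as one interchangeable module."""
    def __init__(self, v0=30.0, T=1.5, a=1.0, b=1.5, s0=2.0):
        self.v0, self.T, self.a, self.b, self.s0 = v0, T, a, b, s0

    def acceleration(self, gap, speed, leader_speed):
        dv = speed - leader_speed  # closing speed
        s_star = self.s0 + speed * self.T + speed * dv / (2 * (self.a * self.b) ** 0.5)
        return self.a * (1 - (speed / self.v0) ** 4 - (s_star / gap) ** 2)

model = IDM()
acc = model.acceleration(gap=20.0, speed=25.0, leader_speed=25.0)       # tailgating
acc_free = model.acceleration(gap=200.0, speed=25.0, leader_speed=25.0) # free flow
```

A framework built this way lets a researcher replace only the module in their area of interest, which is the kind of separation of concerns the abstract describes.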

Relevance: 20.00%

Abstract:

Free association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist cuing, primed free association, intralist cuing, and single-item recognition tasks. The findings also show that when a related word is presented to cue the recall of a studied word, the cue activates it within an array of related words that distract and reduce the probability of its selection. The activation of the semantic network thus produces priming benefits during encoding and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distracter strength and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. We evaluate spreading-activation and quantum-like entanglement explanations for the priming effect produced by neighborhood density.
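One of the network measures discussed, neighborhood density, can be sketched as the proportion of realised links among a word's associates. The exact formula below is an assumption for illustration, not necessarily the authors' definition, and the toy norms are invented:

```python
def neighborhood_density(word, norms):
    """Directed links among a word's associates divided by the number of
    possible directed links (illustrative density measure)."""
    neighbors = set(norms.get(word, []))
    n = len(neighbors)
    if n < 2:
        return 0.0
    links = sum(1 for a in neighbors
                  for b in norms.get(a, [])
                  if b in neighbors and b != a)
    return links / (n * (n - 1))

# Toy free-association norms: word -> associates it cues
norms = {
    "cat": ["dog", "mouse", "fur"],
    "dog": ["cat", "fur"],
    "mouse": ["cat"],
    "fur": ["cat", "dog"],
}
d = neighborhood_density("cat", norms)  # 2 realised links out of 6 possible
```

On this view, a densely connected neighborhood (high `d`) primes more of its members when the word is encoded, which is the mechanism the abstract links to better recall and recognition.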

Relevance: 20.00%

Abstract:

In this paper, we propose an approach which attempts to solve the problem of surveillance event detection, assuming that we know the definition of the events. To facilitate the discussion, we first define two concepts: the event of interest refers to the event that the user requests the system to detect, and the background activities are any other events in the video corpus. This is an unsolved problem due to the factors listed below:

1) Occlusions and clustering: Surveillance scenes of significant interest, at locations such as airports, railway stations and shopping centres, are often crowded, so occlusions and clustering of people are frequently encountered. This significantly affects the feature extraction step; for instance, trajectories generated by object tracking algorithms are usually not robust in such situations.

2) The requirement for real-time detection: The system should process the video fast enough, in both the feature extraction and the detection steps, to facilitate real-time operation.

3) Massive size of the training data set: Suppose an event lasts for 1 minute in a video with a frame rate of 25 fps; the number of frames for this event is 60 × 25 = 1500. If we want a training data set with many positive instances of the event, the video is likely to be very large (i.e. hundreds of thousands of frames or more). How to handle such a large data set is a problem frequently encountered in this application.

4) Difficulty in separating the event of interest from background activities: The events of interest often co-exist with a set of background activities. Temporal ground truth is typically very ambiguous, as it does not distinguish the event of interest from a wide range of co-existing background activities. However, it is not practical to annotate the locations of the events in large amounts of video data. This problem becomes more serious in the detection of multi-agent interactions, since the location of these events often cannot be constrained to within a bounding box.

5) Challenges in determining the temporal boundaries of the events: An event can occur at any arbitrary time with an arbitrary duration. The temporal segmentation of events is difficult and ambiguous, and is also affected by other factors such as occlusions.
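The frame-count arithmetic in point 3 and the temporal-segmentation burden in point 5 can be made concrete with a short sketch; the window and stride sizes are arbitrary choices for illustration, not the paper's settings:

```python
def event_frames(duration_s, fps=25):
    """Number of frames spanned by an event, e.g. 60 s at 25 fps -> 1500."""
    return int(duration_s * fps)

def sliding_windows(n_frames, win, stride):
    """Candidate temporal segments [start, end) over a video; with arbitrary
    onsets and durations, many overlapping windows must be scored."""
    return [(s, s + win) for s in range(0, n_frames - win + 1, stride)]

n = event_frames(60)                              # 1500 frames per 1-minute event
windows = sliding_windows(n, win=250, stride=125) # 50%-overlap candidates
```

Even this tiny example generates 11 overlapping candidate segments for a single minute of video, which scales linearly with corpus length and quadratically if multiple window sizes are tried.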

Relevance: 20.00%

Abstract:

We present a virtual test bed for network security evaluation in mid-scale telecommunication networks. Migration from simulation scenarios to the test bed is supported, enabling researchers to evaluate experiments in a more realistic environment. We provide a comprehensive interface to manage, run and evaluate experiments. On the basis of a concrete example, we show how the proposed test bed can be utilized.

Relevance: 20.00%

Abstract:

The evolution of classic power grids into smart grids creates opportunities for most participants in the energy sector: customers can save money by reducing energy consumption, energy providers can better predict energy demand, and the environment benefits, since lower energy consumption implies lower energy production and thus fewer emissions from power plants. But the information and communication systems supporting smart grids can also be subject to classical or new network attacks. Attacks can result in serious damage, such as harming the privacy of customers, causing economic loss, and even disturbing the power supply/demand balance of large regions and countries. In this paper, we give an overview of the German smart metering architecture, its protocols and its security. Afterwards, we present a simulation framework which enables researchers to analyze security aspects of smart metering scenarios.

Relevance: 20.00%

Abstract:

This work identifies the limitations of n-way data analysis techniques for multidimensional stream data, such as Internet chat room communications data, and establishes a link between data collection and the performance of these techniques. Its contributions are twofold. First, it extends data analysis to multiple dimensions by constructing n-way data arrays known as high-order tensors. Chat room tensors are generated by a simulator which collects and models actual communication data. The accuracy of the model is determined by the Kolmogorov-Smirnov goodness-of-fit test, which compares the simulation data with the observed (real) data. Second, a detailed computational comparison is performed to test several data analysis techniques, including the singular value decomposition (SVD) [1] and the multi-way techniques Tucker1, Tucker3 [2], and Parafac [3].
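The Kolmogorov-Smirnov comparison between simulated and observed data can be sketched as follows. The samples here are synthetic exponential inter-event times generated for illustration, not the actual chat-room streams used in the paper:

```python
import bisect
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)
    def ecdf(sorted_x, v):
        return bisect.bisect_right(sorted_x, v) / len(sorted_x)
    return max(abs(ecdf(a, v) - ecdf(b, v)) for v in a + b)

rng = random.Random(42)
observed = [rng.expovariate(1.0) for _ in range(500)]   # stand-in for real data
simulated = [rng.expovariate(1.0) for _ in range(500)]  # well-fitted model
different = [rng.expovariate(3.0) for _ in range(500)]  # mismatched model
d_same = ks_statistic(observed, simulated)
d_diff = ks_statistic(observed, different)
```

A small statistic indicates the simulator reproduces the observed distribution well; a mismatched model yields a clearly larger statistic, which is how goodness of fit is judged.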

Relevance: 20.00%

Abstract:

Theme paper for the Curriculum Innovation and Enhancement theme. AIM: This paper reports on a research project that trialled an educational strategy implemented in an undergraduate nursing curriculum. The project aimed to explore the effectiveness of 'think aloud' as a strategy for improving clinical reasoning for students in simulated clinical settings. BACKGROUND: Nurses are required to apply and utilise critical thinking skills to enable clinical reasoning and problem solving in the clinical setting (Lasater, 2007). Nursing students are expected to develop and display clinical reasoning skills in practice, but may struggle to articulate the reasons behind decisions about patient care. The 'think aloud' approach is an innovative learning/teaching method which can create an environment suitable for developing clinical reasoning skills in students (Banning, 2008; Lee and Ryan-Wenger, 1997). This project used the 'think aloud' strategy within a simulation context to provide a safe learning environment in which third-year students were assisted to uncover cognitive approaches for making effective patient care decisions, and to improve their confidence, clinical reasoning and active critical reflection about their practice. METHODS: In Semester 2, 2011 at QUT, third-year nursing students undertook high-fidelity simulation (some for the first time), commencing in September 2011. There were two cohorts for strategy implementation (group 1 used think aloud as a strategy within the simulation; group 2 used no specific strategy beyond the nursing assessment frameworks used by all students) in relation to problem solving for patient needs. The think aloud strategy was described to students in their pre-simulation briefing, with time allowed for clarification of the strategy. All other aspects of the simulations remained the same (resources, suggested nursing assessment frameworks, simulation session duration, size of simulation teams, preparatory materials). Ethics approval has been obtained for this project. RESULTS: Results of a qualitative analysis (in progress; to be completed by March 2012) of student and facilitator reports on students' ability to meet the learning objectives of solving patient problems using clinical reasoning, and of their experience with the 'think aloud' method, will be presented. A comparison of clinical reasoning learning outcomes between the two groups will determine the effect on clinical reasoning for students responding to patient problems. CONCLUSIONS: In an environment of increasingly constrained clinical placement opportunities, the exploration of alternative strategies to improve critical thinking skills and develop clinical reasoning and problem solving for nursing students is imperative in preparing nurses to respond to changing patient needs.

Relevance: 20.00%

Abstract:

Three-dimensional conjugate heat transfer simulation of a standard parabolic trough thermal collector receiver is performed numerically in order to visualize and analyze the surface thermal characteristics. The computational model is developed in the Ansys Fluent environment based on some simplifying assumptions. Three test conditions are selected from the existing literature to verify the numerical model directly, and reasonably good agreement between the model and the test results confirms the reliability of the simulation. The solar radiation flux profile around the tube is also approximated from the literature. An in-house macro is written to read the input solar flux as a heat flux wall boundary condition for the tube wall. The numerical results show that there is an abrupt variation in the resultant heat flux along the circumference of the receiver. Consequently, the temperature varies throughout the tube surface. The lower half of the horizontal receiver receives the maximum solar flux and therefore experiences the maximum temperature rise, compared to the upper part, whose temperature is almost level. Reasonable attributions and suggestions are made for this particular type of conjugate thermal system. The knowledge gained from this study will be used to further the analysis and to design an efficient concentrating photovoltaic collector in the near future.
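The abrupt circumferential flux variation described above can be caricatured with a simple piecewise profile: the mirror-facing lower half of the tube sees concentrated flux, while the upper half sees only direct sun. The concentration ratio, irradiance value and functional form here are assumptions for illustration, not the profile approximated in the paper:

```python
import math

DNI = 1000.0          # direct normal irradiance, W/m^2 (assumed)
CONCENTRATION = 20.0  # peak local concentration ratio on the lower half (assumed)

def wall_flux(theta_deg):
    """Illustrative heat flux (W/m^2) at angle theta around the tube,
    measured from the bottom (0 deg = mirror-facing underside)."""
    if abs(theta_deg) <= 90.0:  # lower half: reflected plus direct radiation
        return DNI * (1.0 + CONCENTRATION * math.cos(math.radians(theta_deg)))
    return DNI                  # upper half: direct radiation only

q_bottom = wall_flux(0)    # peak flux on the underside
q_top = wall_flux(180)     # ambient-facing top of the tube
```

Feeding such a non-uniform profile to the wall boundary, as the in-house macro does in the paper, is what produces the strong temperature gradients around the receiver circumference.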

Relevance: 20.00%

Abstract:

Parabolic Trough Concentrators (PTC) are the most proven solar collectors for solar thermal power plants, and are also suitable for concentrating photovoltaic (CPV) applications. PV cells are sensitive to the spatial uniformity of incident light and to the cell operating temperature, so CPV-PTC designs must be optimised both optically and thermally. Optical modelling can be performed using Monte Carlo Ray Tracing (MCRT), with conjugate heat transfer (CHT) modelling using computational fluid dynamics (CFD) to analyse the overall design. This paper develops and evaluates a CHT simulation for a concentrating solar thermal PTC collector. It uses the ray tracing work by Cheng et al. (2010) and thermal performance data for the LS-2 parabolic trough used in the SEGS III–VII plants from Dudley et al. (1994). This is a preliminary step towards developing models to compare the heat transfer performance of faceted absorbers for CPV applications. Reasonable agreement between the simulation results and the experimental data confirms the reliability of the numerical model. The model explores different physical as well as computational issues for this particular kind of system modelling. The physical issues include the resultant non-uniformity of the boundary heat flux profile and the temperature profile around the tube, and the uneven heating of the HTF. The numerical issues include, most importantly, the design of the computational domain(s) and the solution techniques for the turbulence quantities and the near-wall physics. The simulation confirmed that the optical simulation and the computational CHT simulation of the collector can be accomplished independently.
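The optical principle underlying the MCRT step, that rays parallel to the trough axis reflect through the focal line, can be verified with a small 2D geometric sketch. Unlike a full MCRT, there is no random ray sampling, optical error or absorption model here; it only checks the ideal focusing geometry:

```python
def reflected_crossing(x, f):
    """For a vertical ray hitting the parabola y = x^2/(4f) at abscissa x,
    return the y-value where the reflected ray crosses the axis x = 0."""
    y = x * x / (4 * f)
    # unit surface normal at (x, y): gradient of y - x^2/(4f)
    nx, ny = -x / (2 * f), 1.0
    norm = (nx * nx + ny * ny) ** 0.5
    nx, ny = nx / norm, ny / norm
    # incoming ray direction d = (0, -1); reflect: d' = d - 2 (d . n) n
    dot = -ny
    dx, dy = -2 * dot * nx, -1.0 - 2 * dot * ny
    t = -x / dx          # parameter where the reflected ray reaches x = 0
    return y + t * dy

f = 1.0  # focal distance
crossings = [reflected_crossing(x, f) for x in (0.5, 1.0, 2.0)]  # all equal f
```

Every crossing lands at the focus y = f regardless of where the ray strikes the mirror, which is why the receiver tube is placed along the focal line and why its mirror-facing side receives the concentrated flux.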

Relevance: 20.00%

Abstract:

Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced daylighting metrics (Climate-Based Daylight Metrics, CBDM). Yet these tools (new metrics or simulation tools) are not currently understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file formats or can be integrated into current 3D modelling software or packages. This software needs to be able to calculate both point-in-time and annual analyses. There is a current need among these software solutions for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Currently, development of plug-in-based software is trying to meet this need through third-party analysis, although some of these packages are heavily reliant on their host program. Programs that allow dynamic daylighting simulation will make it easier to calculate accurate daylighting regardless of which modelling platform the designer uses, while producing more tangible analysis, without the need to process raw data.
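The daylight factor mentioned above as the current standard metric is straightforward to compute: indoor illuminance as a percentage of the simultaneous unobstructed outdoor illuminance under an overcast sky. The illuminance readings below are illustrative values, and the 2% figure is a commonly cited design benchmark rather than a claim of this paper:

```python
def daylight_factor(e_in, e_out):
    """Daylight factor (%): indoor illuminance divided by simultaneous
    unobstructed outdoor illuminance under a standard overcast sky."""
    return 100.0 * e_in / e_out

# Illustrative readings in lux; DF >= 2% is often quoted as adequate daylight
df = daylight_factor(e_in=210.0, e_out=10500.0)
```

The shortcoming the paper alludes to is visible here: the DF is a single ratio under one fixed sky, whereas climate-based metrics (CBDM) evaluate illuminance over a year of real weather data.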

Relevance: 20.00%

Abstract:

The Early–mid Cretaceous marks the confluence of three major continental-scale events in eastern Gondwana: (1) the emplacement of a Silicic Large Igneous Province (LIP) near the continental margin; (2) the volcaniclastic fill, transgression and regression of a major epicontinental seaway developed over at least a quarter of the Australian continent; and (3) epeirogenic uplift, exhumation and continental rupturing culminating in the opening of the Tasman Basin c. 84 Ma. The Whitsunday Silicic LIP event had widespread impact, producing both substantial extrusive volumes of dominantly silicic pyroclastic material and coeval first-cycle volcanogenic sediment that accumulated within many eastern Australian sedimentary basins, and principally in the Great Australian Basin system (>2 Mkm³ combined volume). The final pulse of volcanism and volcanogenic sedimentation at c. 105–95 Ma coincided with epicontinental seaway regression, which shows a lack of correspondence with the global sea-level curve and instead records a wider, continental-scale effect of volcanism and rift tectonism. Widespread igneous underplating related to this LIP event is evident from high paleogeothermal gradients and regional hydrothermal fluid flow detectable in the shallow crust over a broad region. Enhanced CO2 fluxing through sedimentary basins also indirectly records large-scale, LIP-related mafic underplating. A discrete episode of rapid crustal cooling and exhumation began c. 100–90 Ma along the length of the eastern Australian margin, related to an enhanced phase of continental rifting that was largely amagmatic, and probably to a switch from wide to narrow rift modes. Along-margin variations in detachment fault architecture produced narrow (SE Australia) and wide continental margins with marginal, submerged continental plateaux (NE Australia). Long-lived NE-trending cross-orogen lineaments controlled the switch from narrow to wide continental margin geometries.

Relevance: 20.00%

Abstract:

A new deterministic method for predicting simultaneous inbreeding coefficients at three and four loci is presented. The method involves calculating the conditional probability of IBD (identical by descent) at one locus given IBD at other loci, and multiplying this probability by the prior probability of the latter loci being simultaneously IBD. The conditional probability is obtained by applying a novel regression model, and the prior probability from the theory of digenic measures of Weir and Cockerham. The model was validated for a finite monoecious population mating at random, with a constant effective population size, and with or without selfing, and also for an infinite population with a constant intermediate proportion of selfing. Discrete generations were assumed. Deterministic predictions were very accurate when compared with simulation results, and robust to alternative forms of implementation. The simultaneous inbreeding coefficients were more sensitive to changes in effective population size than to changes in marker spacing. Extensions to predict simultaneous inbreeding coefficients at more than four loci are now possible.
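The core decomposition, a conditional probability of IBD at one locus given IBD at the others multiplied by the prior for the others, can be sketched numerically. All probability values below are assumed for illustration only; in the method itself the conditional comes from the regression model and the prior from Weir and Cockerham's digenic measures:

```python
def simultaneous_ibd(conditional, prior):
    """P(A, B, C all IBD) = P(A IBD | B, C IBD) * P(B, C IBD)."""
    return conditional * prior

p_bc = 0.04          # prior: loci B and C simultaneously IBD (assumed value)
p_a_given_bc = 0.60  # conditional IBD at A given B, C IBD (assumed value;
                     # high for tightly linked loci, approaching the ordinary
                     # inbreeding coefficient as linkage weakens)
p_abc = simultaneous_ibd(p_a_given_bc, p_bc)
```

Chaining this decomposition, conditioning on one additional locus at a time, is what makes the stated extension to more than four loci possible.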