955 results for event-driven simulation
Abstract:
This paper examines the measurement of long-horizon abnormal performance when stock selection is conditional on an extended period of past survival. Filtering on survival skews the sample towards more established, frequently traded stocks, which has implications for the choice of benchmark used in performance measurement (especially in the presence of the well-documented size effect). A simulation study is conducted to document the properties of commonly employed performance measures conditional on past survival. The results suggest that the popular index benchmarks used in long-horizon event studies are severely biased and yield badly misspecified test statistics. In contrast, a matched-stock benchmark based on size and industry performs consistently well, and an eligible-stock index designed to mitigate the influence of the size effect also proves effective.
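As an illustration of the kind of long-horizon measure the abstract evaluates, the sketch below computes a buy-and-hold abnormal return (BHAR) against a benchmark such as a size- and industry-matched stock. This is a standard measure in the long-horizon event-study literature, not code from the paper, and the return values are invented.

```python
from math import prod

def bhar(stock_returns, benchmark_returns):
    """Buy-and-hold abnormal return over an event window: the compound
    return of the event stock minus the compound return of its benchmark
    (e.g. a size- and industry-matched stock)."""
    return prod(1 + r for r in stock_returns) - prod(1 + r for r in benchmark_returns)

# Illustrative: two periods of 10% versus a benchmark earning 5% per period,
# i.e. 1.21 - 1.1025 ≈ 0.1075.
bhar([0.10, 0.10], [0.05, 0.05])
```

A test statistic is then formed from the cross-section of such BHARs; the abstract's point is that its specification depends heavily on the benchmark choice when the sample is survival-conditioned.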
Abstract:
Numerical simulations of turbulent flow in a dense medium cyclone with magnetite medium have been conducted using Fluent. The predicted air-core shape and diameter were found to be close to the experimental results measured by gamma-ray tomography. It is possible that the large eddy simulation (LES) turbulence model combined with the mixture multi-phase model can be used to predict the air/slurry interface accurately, although LES may require a finer grid. Multi-phase simulations (air/water/medium) show appropriate medium segregation effects but over-predict the level of segregation compared to that measured by gamma-ray tomography, in particular over-predicting medium concentrations near the wall. Further, the accurate prediction of axial segregation of magnetite was investigated using the LES turbulence model together with the multi-phase mixture model and viscosity corrections according to the feed particle loading factor. The addition of lift forces and the viscosity correction improved the predictions, especially near the wall. Predicted density profiles are very close to the gamma-ray tomography data, showing a clear density drop near the wall. The effect of the size distribution of the magnetite has been studied in detail. It is interesting to note that the ultra-fine magnetite sizes (i.e. 2 and 7 μm) are distributed uniformly throughout the cyclone. As the size of the magnetite increases, more segregation of magnetite occurs close to the wall. The cut size (d50) of the magnetite segregation is 32 μm, which is expected with a superfine magnetite feed size distribution. At higher feed densities the agreement between the [Dungilson, 1999; Wood, J.C., 1990. A performance model for coal-washing dense medium cyclones, Ph.D. Thesis, JKMRC, University of Queensland] correlations and the CFD results is reasonably good, but the overflow density is lower than the model predictions.
It is believed that the excessive underflow volumetric flow rates are responsible for the under-prediction of the overflow density. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to understanding the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics, and may thus lead towards answers to these questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in computer power and the ability to exploit parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law, with no rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between events. In some of the larger events highly complex slip patterns are observed.
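The stick-slip cycle described above can be illustrated, far more crudely than the paper's 3-D particle model, with a classic one-dimensional spring-slider: the block sticks until the spring force exceeds static friction, then slides against lower dynamic friction until it arrests. With homogeneous parameters this sketch produces quasi-periodic characteristic events; it is the fault roughness and 3-D elasticity of the full model that broaden the event-size distribution. All parameter values here are illustrative.

```python
def stick_slip(steps=20000, dt=0.01, k=1.0, v_drive=0.1,
               mu_s=1.0, mu_d=0.6, m=1.0):
    """One-dimensional spring-slider sketch of driven stick-slip. A block
    is dragged through a spring whose far end moves at constant velocity;
    it sticks until the spring force exceeds static friction (mu_s), then
    slides against lower dynamic friction (mu_d) until it arrests.
    Returns the slip distance of each event observed."""
    x = v = 0.0                          # block position and velocity
    events, slip_start = [], None
    for i in range(steps):
        load = k * (v_drive * i * dt - x)     # spring force from the driver
        if v == 0.0 and abs(load) <= mu_s:
            if slip_start is not None:        # just re-stuck: record the event
                events.append(x - slip_start)
                slip_start = None
            continue                          # stuck: static friction holds
        if slip_start is None:
            slip_start = x                    # a slip event begins
        a = (load - mu_d * (1.0 if v >= 0 else -1.0)) / m
        v += a * dt                           # semi-implicit Euler step
        x += v * dt
        if v < 0.0:                           # crude arrest rule: no back-slip
            v = 0.0
    return events
```

Shearing at constant driving velocity, as in the paper's model, yields a sequence of discrete slip events separated by loading intervals.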
Abstract:
The Co(III) complexes of the hexadentate tripodal ligands HOsen (3-(2'-aminoethylamino)-2,2-bis((2''-aminoethylamino)methyl)propan-1-ol) and HOten (3-(2'-aminoethylthio)-2,2-bis((2''-aminoethylthio)methyl)propan-1-ol) have been synthesized and fully characterized. The crystal structures of [Co(HOsen)]Cl3·H2O and [Co(HOten)](ClO4)Cl2 are reported; in the two complexes the ligands coordinate as tripodal hexadentate N6 and N3S3 donors, respectively. Cyclic voltammetry of the N3S3-coordinated complex [Co(HOten)]3+ is complicated and electrode dependent. On a Pt working electrode an irreversible Co(III/II) couple (formal potential −157 mV versus Ag/AgCl) is seen, which is indicative of dissociation of the divalent complex formed at the electrode. The free HOten released by the dissociation of [Co(HOten)]2+ can be recaptured by Hg, as shown by cyclic voltammetry experiments on a static Hg drop electrode (or in the presence of Hg2+ ions), which leads to the formation of an electroactive Hg(II) complex of the N3S3 ligand (formal potential +60 mV versus Ag/AgCl). This behaviour is in contrast to the facile and totally reversible voltammetry of the hexaamine complex [Co(HOsen)]3+ (formal potential (Co(III/II)) −519 mV versus Ag/AgCl), which is uncomplicated by any coupled chemical reactions. A kinetic and thermodynamic analysis of the [Co(HOten)]2+/[Hg(HOten)]2+ system is presented on the basis of digital simulation of the experimental voltammetric data.
Abstract:
Purpose – To investigate the role of simulation in the introduction of technology in a continuous operations process.
Design/methodology/approach – A case-based research method was chosen with the aim of providing an exemplar of practice and testing the proposition that the use of simulation can improve the implementation and running of conveyor systems in continuous process facilities.
Findings – The research determines the optimum rate of re-introduction of inventory to a conveyor system generated during a breakdown event.
Research limitations/implications – More case studies are required demonstrating the operational and strategic benefits that can be gained by using simulation to assess technology in organisations.
Practical implications – A practical outcome of the study was the implementation of a policy for the manual re-introduction of inventory on a conveyor line after a breakdown event had occurred.
Originality/value – The paper presents a novel example of the use of simulation to estimate the re-introduction rate of inventory after a breakdown event on a conveyor line. It highlights how, by addressing this operational issue ahead of implementation, the likelihood of success of the strategic decision to acquire the technology can be improved.
Abstract:
A discrete event simulation model was developed and used to estimate the storage area required for a proposed overseas textile manufacturing facility. It was found that the simulation was able to achieve this because of its ability both to store attribute values and to show queuing levels at an individual product level. It was also found that the process of undertaking the simulation project initiated useful discussions regarding the operation of the facility. Discrete event simulation is shown to be much more than an exercise in the quantitative analysis of results, and an important task of the simulation project manager is to initiate a debate among decision makers regarding the assumptions of how the system operates.
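The two capabilities the abstract credits, carrying attribute values on entities and reporting queue levels per product type, can be sketched with a minimal hand-rolled event queue. This is not the authors' model: the product types, rates and dwell times below are invented for illustration.

```python
import heapq
import random

def simulate_storage(n_items=1000, seed=42):
    """Hand-rolled discrete event simulation of items flowing through a
    storage area. Each event carries the item's product type as an
    attribute, and the model records the peak queue level per product
    type -- the sizing quantity a storage-area study needs."""
    random.seed(seed)
    queues = {"shirts": 0, "trousers": 0}    # hypothetical product types
    peak = {p: 0 for p in queues}
    events = []                              # min-heap of (time, priority, kind, product)

    t = 0.0
    for _ in range(n_items):
        t += random.expovariate(1.0)                      # inter-arrival times
        product = random.choice(list(queues))             # attribute stored on the item
        heapq.heappush(events, (t, 0, "arrive", product))
        dwell = random.expovariate(0.5)                   # time spent in storage
        heapq.heappush(events, (t + dwell, 1, "depart", product))

    while events:                            # process events in time order
        _, _, kind, product = heapq.heappop(events)
        queues[product] += 1 if kind == "arrive" else -1
        peak[product] = max(peak[product], queues[product])
    return peak
```

The returned peak levels per product are what would drive the storage-area estimate; a commercial DES package does the same bookkeeping behind its animation.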
Abstract:
The nature of Discrete-Event Simulation (DES) and the use of DES in organisations are changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented showing that, despite these developments, usage of DES remains relatively low owing to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to a greater achievement and appreciation of the full benefit of DES and thus to greater usage. Firstly, in relation to using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered from the perspective of an information-processing tool within the organisation. This allows outcomes to be considered under the three modes of organisational information use (sense making, knowledge creating and decision making), which relate to the theoretical areas of knowledge management, organisational learning and decision making respectively. The association of DES with these popular techniques could further increase its usage in business.
Abstract:
Suggests that simulation of the workflow component of a computer supported co-operative work (CSCW) system has the potential to reduce the costs of system implementation, while at the same time improving the quality of the delivered system. Demonstrates the value of being able to assess the frequency and volume of workflow transactions using a case study of CSCW software developed for estate agency co-workers in which a model was produced based on a discrete-event simulation approach with implementation on a spreadsheet platform.
Abstract:
The high capital cost of robots prohibits their economic application. One method of making their application more economic is to increase their operating speed. This can be done in a number of ways, e.g. redesigning the robot geometry, improving the actuators and improving the control system design. In this thesis the control system design is considered. The literature review identifies two aspects of robot control system design that have not been addressed in any great detail by previous researchers: how significant are the coupling terms in the dynamic equations of the robot, and what is the effect of the coupling terms on the performance of a number of typical independent-axis control schemes? The work in this thesis addresses these two questions in detail. A program was designed to automatically calculate the path and trajectory and to calculate the significance of the coupling terms in an example application of a robot manipulator tracking a part on a moving conveyor. The inertial and velocity coupling terms were shown to be significant when the manipulator was considered to be directly driven. A simulation of the robot manipulator following the planned trajectory was established in order to assess the performance of the independent-axis control strategies. The inertial coupling was shown to reinforce the control torque at the corner points of the trajectory, where there was an abrupt demand in acceleration in each axis but of opposite sign. This reduced the tracking error; however, this effect was not controllable. A second effect was due to the velocity coupling terms. At high trajectory speeds it was shown, by means of a root locus analysis, that the velocity coupling terms caused the system to become unstable.
Abstract:
T-cell activation requires interaction of T-cell receptors (TCR) with peptide epitopes bound by major histocompatibility complex (MHC) proteins. This interaction occurs at a special cell-cell junction known as the immune or immunological synapse. Fluorescence microscopy has shown that the interplay among one agonist peptide-MHC (pMHC), one TCR and one CD4 provides the minimum complexity needed to trigger transient calcium signalling. We describe a computational approach to the study of the immune synapse. Using molecular dynamics simulation, we report here on a study of the smallest viable model, a TCR-pMHC-CD4 complex in a membrane environment. The computed structural and thermodynamic properties are in fair agreement with experiment. A number of biomolecules participate in the formation of the immunological synapse. Multi-scale molecular dynamics simulations may be the best opportunity we have to reach a full understanding of this remarkable supra-macromolecular event at a cell-cell junction.
Abstract:
Computer based discrete event simulation (DES) is one of the most commonly used aids for the design of automotive manufacturing systems. However, DES tools represent machines in extensive detail, while only representing workers as simple resources. This presents a problem when modelling systems with a highly manual work content, such as an assembly line. This paper describes research at Cranfield University, in collaboration with the Ford Motor Company, founded on the assumption that human variation is the cause of a large percentage of the disparity between simulation predictions and real world performance. The research aims to improve the accuracy and reliability of simulation prediction by including models of human factors.
Abstract:
Discrete event simulation of manufacturing systems has become widely accepted as an important tool to aid the design of such systems. Often, however, it is applied by practitioners in a manner which largely ignores an important element of industry; namely, the workforce. Workers are usually represented as simple resources, often with deterministic performance values. This approach ignores the potentially large effect that human performance variation can have on a system. A long-term data collection exercise is described with the aim of quantifying the performance variation of workers in a typical automotive assembly plant. The data are presented in a histogram form which is immediately usable in simulations to improve the accuracy of design assessment. The results show levels of skewness and range which are far larger than anticipated by current researchers and practitioners in the field.
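Feeding a collected histogram into a simulation, rather than assuming a deterministic cycle time, is typically done by sampling a bin in proportion to its frequency and then sampling uniformly within the bin. The sketch below shows this; the histogram values are invented, with a skewed, long-tailed shape that mirrors the abstract's finding rather than reproducing the study's data.

```python
import random

# Hypothetical cycle-time histogram for one assembly task, as
# (bin_low, bin_high, count) in seconds. Illustrative numbers only.
HISTOGRAM = [(30, 35, 5), (35, 40, 40), (40, 45, 25), (45, 60, 20), (60, 120, 10)]

def sample_cycle_time(rng=random):
    """Draw a worker cycle time from the empirical histogram: choose a bin
    in proportion to its observed frequency, then sample uniformly within
    that bin."""
    low, high = rng.choices([b[:2] for b in HISTOGRAM],
                            weights=[b[2] for b in HISTOGRAM])[0]
    return rng.uniform(low, high)
```

Replacing a fixed cycle time with draws from such a distribution is what lets a model reproduce the blocking and starvation effects that deterministic worker resources hide.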
Abstract:
The heightened threat of terrorism has caused governments worldwide to plan for responding to large-scale catastrophic incidents. In England the New Dimension Programme supplies equipment, procedures and training to the Fire and Rescue Service to ensure the country's preparedness to respond to a range of major critical incidents. The Fire and Rescue Service is involved partly by virtue of being able to very quickly mobilize a large skilled workforce and specialist equipment. This paper discusses the use of discrete event simulation modeling to understand how a fire and rescue service might position its resources before an incident takes place, to best respond to a combination of different incidents at different locations if they happen. Two models are built for this purpose. The first model deals with mass decontamination of a population following a release of a hazardous substance—aiming to study resource requirements (vehicles, equipment and manpower) necessary to meet performance targets. The second model deals with the allocation of resources across regions—aiming to study cover level and response times, analyzing different allocations of resources, both centralized and decentralized. Contributions to theory and practice in other contexts (e.g. the aftermath of natural disasters such as earthquakes) are outlined.
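The resource-requirement question behind the first model can be illustrated as a multi-server queue: given an arrival stream of affected people and a number of parallel decontamination lanes, how long until the last person clears? The sketch below is a toy stand-in for the paper's model, and every rate and count is invented.

```python
import heapq
import random

def decon_sim(n_people=500, n_lanes=4, service_mean=2.0,
              arrival_mean=0.5, seed=7):
    """Toy model of mass decontamination as a multi-server queue: n_lanes
    parallel decontamination units serve an arriving crowd. Returns the
    time at which the last person clears -- a crude proxy for testing
    whether a given resource level meets a performance target."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_people):
        t += rng.expovariate(1.0 / arrival_mean)   # random inter-arrival times
        arrivals.append(t)
    lanes = [0.0] * n_lanes                        # time each lane next becomes free
    heapq.heapify(lanes)
    finish = 0.0
    for a in arrivals:
        free = heapq.heappop(lanes)                # earliest-free lane
        done = max(a, free) + rng.expovariate(1.0 / service_mean)
        finish = max(finish, done)
        heapq.heappush(lanes, done)
    return finish
```

Sweeping `n_lanes` (and, in the second model, the split of lanes across regions) against a target clearance time is the kind of what-if analysis the paper's models support.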
Abstract:
This paper presents a discrete event simulation study to examine tenancy service performance in a shopping centre. The study aims to provide an understanding of how informal management mechanisms could enhance existing ERP systems. The research shows the potential benefits of combining the traditional strengths of ERP in providing better performance in terms of efficiency with the ability to react with flexibility to customers' requests. © 2012 SIMULATION COUNCILS, INC.
Abstract:
Simulation is an effective method for improving supply chain performance. However, there is limited advice available to assist practitioners in selecting the most appropriate method for a given problem, and much of the advice that does exist relies on custom and practice rather than on a rigorous conceptual or empirical analysis. An analysis of the different modelling techniques applied in the supply chain domain was conducted, and the three main approaches to simulation used were identified: System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). This research examined these approaches in two stages. Firstly, a first-principles analysis was carried out in order to challenge the received wisdom about their strengths and weaknesses, and a series of propositions was developed from this initial analysis. The second stage used the case study approach to test these propositions and to provide further empirical evidence to support their comparison. The contributions of this research are in terms of both knowledge and practice. In terms of knowledge, this research is the first holistic cross-paradigm comparison of the three main approaches in the supply chain domain. The case studies involved building 'back to back' models of the same supply chain problem using SD and a discrete approach (either DES or ABM). This has led to contributions concerning the limitations of applying SD to operational problem types; SD has also been found to carry risks when applied to strategic and policy problems. Discrete methods have been found to have potential for exploring strategic problem types, and it has been found that discrete simulation methods can model material and information feedback successfully. Further insights have been gained into the relationship between modelling purpose and modelling approach.
In terms of practice, the findings have been summarised in the form of a framework linking modelling purpose, problem characteristics and simulation approach.