954 results for Discrete-events simulation


Relevance:

30.00%

Publisher:

Abstract:

The motor system can no longer be considered a mere passive executor of motor commands generated elsewhere in the brain. On the contrary, it is deeply involved in perceptual and cognitive functions and acts as an "anticipation device". The present thesis investigates the anticipatory motor mechanisms occurring in two particular instances: i) when processing sensory events occurring within the peripersonal space (PPS); and ii) when perceiving and predicting others' actions. The first study provides evidence that PPS representation in humans modulates neural activity within the motor system, while the second demonstrates that the motor mapping of sensory events occurring within the PPS critically relies on the activity of the premotor cortex. The third study provides direct evidence that the anticipatory motor simulation of others' actions critically relies on the activity of the anterior node of the action observation network (AON), namely the inferior frontal cortex (IFC). The fourth study sheds light on the pivotal role of the left IFC in predicting the future end state of observed right-hand actions. Finally, the fifth study examines how the ability to predict others' actions can be influenced by a reduction of sensorimotor experience due to the traumatic or congenital loss of a limb. Overall, the present work provides new insights into: i) the anticipatory mechanisms of the basic reactivity of the motor system when processing sensory events occurring within the PPS, and the same anticipatory motor mechanisms when perceiving others' implied actions; ii) the functional connectivity and plasticity of premotor-motor circuits both during the motor mapping of sensory events occurring within the PPS and when perceiving others' actions; and iii) the anticipatory mechanisms related to the prediction of others' actions.

Relevance:

30.00%

Publisher:

Abstract:

The experiments at the Large Hadron Collider at CERN, the European Centre for Particle Physics, rely on efficient and reliable trigger systems for singling out interesting events. This thesis documents two online timing-monitoring tools for the central trigger of the ATLAS experiment, as well as the adaptation of the central trigger simulation as part of the upgrade for the second LHC run. Moreover, a search for candidates for so-called Dark Matter, for which there is ample cosmological evidence, is presented. This search for generic weakly interacting massive particles (WIMPs) is based on roughly 20 fb⁻¹ of proton-proton collisions at a centre-of-mass energy of √s = 8 TeV recorded with the ATLAS detector in 2012. The signature considered is events with a highly energetic jet and large missing transverse energy. No significant deviation from the theory prediction is observed. Exclusion limits are derived on parameters of different signal models and compared to the results of other experiments. Finally, the results of a simulation study on the potential of the analysis at √s = 14 TeV are presented.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and nowadays Cyber-Physical Systems (CPS) are a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls some physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: there are many models, written in different languages, that need to exchange information with each other. Normally, an orchestrator takes care of simulating the models and exchanging information between them. This orchestrator is developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through Co-Modeling, i.e. by modeling the coordination. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: FMI. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of models created in an existing tool for discrete modeling: TimeSquare. I also developed a simple physical model in the existing open-source OpenModelica tool. Later, I started to understand how an orchestrator works by developing a simple one; this will be useful in the future for generating an orchestrator automatically.
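As a rough, hypothetical sketch of what such an orchestrator does (the interface below is invented for illustration; it is not the actual FMI master algorithm or C API), the following Python loop first exchanges coupled values and then advances every unit by one communication step:

```python
# Toy co-simulation orchestrator. The Unit class is a made-up stand-in
# for an FMU-like component; real FMI units expose a far richer API.

class Unit:
    """Minimal simulation unit: one input, one output, trivial dynamics."""
    def __init__(self, gain):
        self.gain = gain
        self.inp = 0.0
        self.out = 0.0

    def set_input(self, value):
        self.inp = value

    def do_step(self, dt):
        # Placeholder dynamics: the output simply follows the scaled input.
        self.out = self.gain * self.inp

    def get_output(self):
        return self.out


def orchestrate(units, couplings, n_steps, dt):
    """Fixed-step master loop: propagate outputs, then step every unit."""
    for _ in range(n_steps):
        for src, dst in couplings:          # 1) exchange coupled values
            units[dst].set_input(units[src].get_output())
        for u in units.values():            # 2) advance all units by dt
            u.do_step(dt)
    return {name: u.get_output() for name, u in units.items()}


units = {"a": Unit(2.0), "b": Unit(3.0)}
units["a"].set_input(1.0)                   # constant external stimulus
final = orchestrate(units, [("a", "b")], n_steps=3, dt=0.1)
```

Note the one-step delay introduced by exchanging values only at communication points; this lockstep exchange is characteristic of fixed-step co-simulation masters.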

Relevance:

30.00%

Publisher:

Abstract:

Human HeLa cells expressing mouse connexin30 were used to study the electrical properties of gap junction channel substates. Experiments were performed on cell pairs using a dual voltage-clamp method. Single-channel currents revealed discrete levels attributable to a main state, a residual state, and five substates interposed, suggesting the operation of six subgates provided by the six connexins of a gap junction hemichannel. Substate conductances, gamma(j,substate), were unevenly distributed between the main-state and the residual-state conductance (gamma(j,main state) = 141 pS, gamma(j,residual state) = 21 pS). Activation of the first subgate reduced the channel conductance by approximately 30%, and activation of subsequent subgates resulted in conductance decrements of 10-15% each. Current transitions between the states were fast (<2 ms). Substate events were usually demarcated by transitions from and back to the main state; transitions among substates were rare. Hence, subgates are recruited simultaneously rather than sequentially. The incidence of substate events was larger at larger gradients of V(j). Frequency and duration of substate events increased with increasing number of synchronously activated subgates. Our mathematical model, which describes the operation of gap junction channels, was expanded to include channel substates. Based on the established V(j)-sensitivity of gamma(j,main state) and gamma(j,residual state), the simulation yielded unique functions gamma(j,substate) = f(V(j)) for each substate. Hence, the spacing of subconductance levels between the channel main state and residual state was uneven and characteristic for each V(j).
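For illustration only, the conductance figures quoted above can be arranged into a ladder of substate levels. The even spacing below the first subgate is an assumption used to make the 10-15% decrements concrete, not a result from the study:

```python
# Illustrative ladder of gap-junction substate conductances between the
# main state (141 pS) and the residual state (21 pS). The first subgate
# removes ~30% of the main-state conductance; the remaining drop is
# spread evenly over the other subgates (an assumption for illustration).

MAIN_PS = 141.0
RESIDUAL_PS = 21.0

def substate_ladder(n_subgates=6, first_drop=0.30):
    levels = [MAIN_PS]
    g = MAIN_PS * (1.0 - first_drop)            # after the first subgate
    levels.append(g)
    step = (g - RESIDUAL_PS) / (n_subgates - 1)  # even remaining decrements
    for _ in range(n_subgates - 1):
        g -= step
        levels.append(g)
    return levels

ladder = substate_ladder()
```

Under this assumption the even step works out to about 15.5 pS, i.e. roughly 11% of the main-state conductance per subgate, consistent with the 10-15% decrements reported above.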

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To compare objective fellow and expert efficiency indices for an interventional radiology renal artery stenosis skill set with the use of a high-fidelity simulator. MATERIALS AND METHODS: The Mentice VIST simulator was used for three different renal artery stenosis simulations of varying difficulty, which were used to grade performance. Fellows' indices at three intervals throughout 1 year were compared to expert baseline performance. Seventy-four simulated procedures were performed, 63 of which were captured as audiovisual recordings. Three levels of fellow experience were analyzed: 1, 6, and 12 months of dedicated interventional radiology fellowship. The recordings were compiled on a computer workstation and analyzed. Distinct measurable events in the procedures were identified with task analysis, and data regarding efficiency were extracted. Total scores were calculated as the product of procedure time, fluoroscopy time, tools, and contrast agent volume. The lowest scores, which reflected efficient use of tools, radiation, and time, were considered to indicate proficiency. Subjective analysis of participants' procedural errors was not included in this analysis. RESULTS: Fellows' mean scores diminished from 1 month to 12 months (42,960 at 1 month, 18,726 at 6 months, and 9,636 at 12 months). The experts' mean score was 4,660. In addition, the range of variance in score diminished with increasing experience (from a range of 5,940-120,156 at 1 month to 2,436-85,272 at 6 months and 2,160-32,400 at 12 months). Expert scores ranged from 1,450 to 10,800. CONCLUSIONS: Objective efficiency indices for simulated procedures can demonstrate scores directly comparable to the level of clinical experience.
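As a quick arithmetic sketch of the scoring rule described above, the composite index is simply the product of the four measures, so lower values mean greater efficiency. The units and input values below are assumed for illustration (minutes for the two times, a tool count, and millilitres of contrast):

```python
# Composite efficiency index as described in the abstract: the product of
# procedure time, fluoroscopy time, number of tools, and contrast volume.
# Units are assumed (minutes, minutes, count, mL); lower scores are better.

def efficiency_score(procedure_min, fluoro_min, n_tools, contrast_ml):
    return procedure_min * fluoro_min * n_tools * contrast_ml

# Hypothetical first-month fellow: 30 min procedure, 10 min fluoroscopy,
# 4 tools, 40 mL of contrast agent.
score = efficiency_score(30, 10, 4, 40)
```

This hypothetical case scores 48,000, which falls inside the 1-month range (5,940-120,156) reported above; because the index is multiplicative, halving any single factor halves the whole score.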

Relevance:

30.00%

Publisher:

Abstract:

The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can be best managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, and without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration, while minimizing GHG emissions.
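A minimal sketch, assuming a two-part emissions model (not the paper's methodology), shows why delay alone raises the total: scope-driven emissions are fixed, while overhead emissions accrue per calendar day.

```python
# Toy project-emissions model. The rates below are invented for
# illustration; the point is only that overheads scale with duration.

def project_emissions(planned_days, delay_days,
                      work_rate_kgco2=500.0,       # assumed, per working day
                      overhead_rate_kgco2=120.0):  # assumed, per calendar day
    total_days = planned_days + delay_days
    # Work-related emissions are fixed by project scope;
    # site overheads (generators, lighting, idling plant) run every day.
    return planned_days * work_rate_kgco2 + total_days * overhead_rate_kgco2

on_time = project_emissions(100, 0)    # 62,000 kg CO2e
delayed = project_emissions(100, 20)   # 64,400 kg CO2e
```

Under these assumed rates, a 20% schedule overrun adds roughly 4% to total emissions without any extra work being done, which is the effect the empirical analysis above illustrates.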

Relevance:

30.00%

Publisher:

Abstract:

During a project, managers encounter numerous contingencies and are faced with the challenging task of making decisions that will effectively keep the project on track. This task is difficult because construction projects are non-prototypical and the processes are irreversible. Therefore, it is critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing from existing research: (1) looking at the effects of local decisions on the global project outcomes, (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made, (3) establishing a method to assess and improve the management decision strategies, and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects that have different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules accompanied by suitable decision strategies to manage a project at the planning stage. The developed methodology also lays the foundation for an algorithm that continuously and automatically generates satisfactory schedules and strategies throughout the construction life of a project.
Rather than studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA provides a previously developed emulator that duplicates the construction process, along with a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
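The strategy idea can be caricatured in a few lines (an invented toy, not ICDMA or the dissertation's framework): a decision strategy is a policy mapping each disruption to a response, and candidate strategies are compared by replaying the same random disruptions against each.

```python
# Toy comparison of decision strategies under random schedule disruptions.
# Severities, recovery rules, and the scoring are all assumptions.

import random

def simulate(schedule_days, strategy, n_disruptions=3, seed=0):
    """Total project duration after responding to random disruptions."""
    rng = random.Random(seed)
    duration = schedule_days
    for _ in range(n_disruptions):
        severity = rng.randint(1, 5)               # days lost to the event
        duration += severity - strategy(severity)  # response recovers days
    return duration

def passive(severity):
    return 0                                       # absorb every delay

def overtime(severity):
    return severity - 1 if severity >= 3 else 0    # recover all but one day
```

Because both policies see identical disruptions for a given seed, the comparison isolates the effect of the strategy itself, mirroring the iterative assess-and-improve loop described above.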

Relevance:

30.00%

Publisher:

Abstract:

The achievable conveying speed of a vibratory conveyor depends largely on the motion function of its conveying element. For a targeted simulation of such machines with the discrete element method (DEM), the geometrically meshed replicas of the conveying element must be driven with motion functions that are relevant in practice. This article describes the integration of these motion functions into the open-source DEM software LIGGGHTS. During the simulation process, the motion of meshed CAD models is realised by means of trigonometric series.
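A minimal sketch of such a motion function, assuming made-up harmonic amplitudes, phases, and base frequency, evaluates a trigonometric series at a given time:

```python
# Displacement of the conveying element as a trigonometric series:
# x(t) = sum_k A_k * sin(2*pi*k*f*t + phi_k). Amplitudes (mm), phases,
# and the base frequency below are illustration values, not measured data.

import math

def motion(t, base_freq_hz, harmonics):
    """harmonics: iterable of (k, amplitude, phase) triples."""
    return sum(a * math.sin(2.0 * math.pi * k * base_freq_hz * t + phi)
               for k, a, phi in harmonics)

# Fundamental at 50 Hz plus a weaker, phase-shifted second harmonic.
series = [(1, 2.0, 0.0), (2, 0.5, math.pi / 2.0)]
```

At t = 0 only the phase-shifted second harmonic contributes (0.5 mm), and the waveform repeats with the 20 ms fundamental period; a meshed model driven this way would follow exactly that trajectory.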

Relevance:

30.00%

Publisher:

Abstract:

This contribution presents a novel conveying principle for the cushioned pick-up and transport of parcels that arrive in bulk. The principle is based on a planar carrying medium in the form of a reconfigurable, elastic composite of small-scale conveyor modules. The proposed transport principle with peristaltic properties is intended to quickly dissolve emerging parcel jams and to allow dedicated control of subsets in order to reach the required throughput within a material flow system. This solution enables a sensible combination of the operating principles of bulk-material and unit-load handling for picking up and moving parcels as bulk material. The basic functionality of the conveying concept is verified by numerical simulation based on the discrete element method as well as by multi-body simulation.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the ability to respond to real-time changes in operations and the reconfigurability of equipment have become likely essential characteristics of next-generation intralogistics systems, alongside the level of automation, cost effectiveness, and maximum throughput. In order to cope with turbulence and increasingly dynamic conditions, future intralogistics systems have to feature short reaction times, high flexibility in processes, and the ability to adapt to frequent changes. The increasing autonomy and complexity in the processes of today's intralogistics systems require new and innovative management approaches, which allow a fast response to (un)anticipated events and adaptation to a changing environment in order to reduce the negative consequences of these events. The ability of a system to respond effectively to a disruption depends more on the decisions taken before the event than on those taken during or after it. In this context, anticipatory change planning can be a usable approach for managers to make contingency plans for intralogistics systems to deal with the rapidly changing marketplace. This paper proposes a simulation-based decision-making framework for the anticipatory change planning of intralogistics systems. This approach includes quantitative assessments based on simulation of defined scenarios as well as an analysis of performance availability that combines the flexibility corridors of different performance dimensions. The implementation of the approach is illustrated on a new intralogistics technology called the Cellular Transport System.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions, and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or detailed cellular structures that exist in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment to mimic cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding that may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically, so the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design implementation, provides an overview of the types of features available, and highlights the utility of those features in demonstrations.
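The event-driven core such a simulator needs can be sketched with a priority queue (a much-simplified stand-in, not the CDS algorithm itself): pending events are ordered by time and the earliest is always processed next.

```python
# Simplified event-driven loop: a heap keeps (time, label) events sorted
# so the earliest collision/reaction is always handled first. A real
# simulator would also reschedule future events after each one fires.

import heapq

def run_events(events, t_end):
    """Process events in time order up to t_end; return their labels."""
    queue = list(events)
    heapq.heapify(queue)                  # O(n) build of the min-heap
    processed = []
    while queue and queue[0][0] <= t_end:
        t, label = heapq.heappop(queue)   # earliest pending event
        processed.append(label)
    return processed

order = run_events([(3.0, "react"), (1.0, "collide"),
                    (2.0, "collide"), (9.0, "late")], t_end=5.0)
```

Events may arrive unsorted yet are processed strictly in time order, and anything beyond the time horizon stays untouched; this ordering is what makes precise per-molecule collision timing tractable.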

Relevance:

30.00%

Publisher:

Abstract:

Up to 15 people can participate in the game, which is supervised by a moderator. Households consisting of 1-5 people discuss options for diversification of household strategies. Aim of the game: By devising appropriate strategies, households seek to withstand various types of events while improving their economic and social situation and, at the same time, taking account of ecological conditions. The annual General Community Meeting (GCM) provides an opportunity for households to create a general set-up at the local level that is more or less favourable to the strategies they are pursuing. The development of a community investment strategy, to be implemented by the GCM, and successful coordination between households will allow players to optimise their investments at the household level. The household that owns the most assets at the end of the game wins. Players participate very actively, as the game stimulates lively and interesting discussions. They find themselves confronted with different types of decision-making related to the reality of their daily lives. They explore different ways to model their own household strategies and discuss risks and opportunities. Reflections on the course of the game continually refer to the real-life situations of the participants.

Relevance:

30.00%

Publisher:

Abstract:

Three teams consisting of 2 to 5 persons each play the game. Each team represents a farm. Each team decides jointly on its strategy. In annual meetings in winter, the farm teams jointly discuss, evaluate and decide on how to proceed and actions to be taken. The farms make use of three different pasture areas (village pasture, intensive pasture and summer pasture) for grazing their livestock. The carrying capacity of each pasture area is different and varies according to the season. In each season, the farms have to decide on how many livestock units to graze on which pasture. Overgrazing and pasture degradation occur if the total number of livestock units exceeds the carrying capacity of a specific pasture area. Overgrazing results in a reduction of pasture productivity. To diversify and improve their livelihood strategy, farms can make individual investments to increase productivity at the farm level, e.g. in fodder production or in income-generating activities. At the community level, collective investments can be made which may influence livestock and household economy, e.g. rehabilitating and improving pasture productivity, improving living conditions on remote pastures, etc. Events occurring in the course of the game represent different types of (risk) factors such as meteorology, market, politics, etc. that may positively or negatively influence livestock production and household economy. A sustainable management of pastures requires that farms actively regulate the development of their herds, that they take measures to prevent pasture degradation and to improve pasture productivity, and that they find a balance between livestock economy and other productive activities. The game has a double aim: a) each farm aims at its economic success and prosperity, and b) the three farm teams jointly have to find and implement strategies for a sustainable use of pasture areas.
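The overgrazing rule of the game can be sketched in a few lines; the 20% degradation rate below is an assumption for illustration, not a parameter of the actual game:

```python
# Toy version of the pasture rule described above: exceeding the carrying
# capacity in a season degrades that pasture's productivity. The 20%
# degradation rate is invented for illustration.

def next_productivity(productivity, livestock_units, carrying_capacity,
                      degradation=0.2):
    """Productivity of a pasture after one season of grazing."""
    if livestock_units > carrying_capacity:
        return productivity * (1.0 - degradation)  # overgrazed: degrade
    return productivity                            # within capacity

overgrazed = next_productivity(100.0, livestock_units=60, carrying_capacity=50)
sustainable = next_productivity(100.0, livestock_units=40, carrying_capacity=50)
```

Because the penalty compounds season after season, repeated overgrazing erodes the shared resource quickly, which is exactly the collective-action tension the game is built around.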

Relevance:

30.00%

Publisher:

Abstract:

The majority of sensor network research deals with land-based networks, which are essentially two-dimensional, and thus the majority of simulation and animation tools also only handle such networks. Underwater sensor networks, on the other hand, are essentially 3D networks, because the depth at which a sensor node is located must also be considered. Due to that additional dimension, specialized tools need to be used when conducting simulations for experimentation. The School of Engineering's Underwater Sensor Network (UWSN) lab is conducting research on underwater sensor networks and requires simulation tools for 3D networks. The lab has extended NS-2, a widely used network simulator, so that it can simulate three-dimensional networks. However, NAM, a widely used network animator, currently only supports two-dimensional networks, and no extensions have been implemented to give it three-dimensional capabilities. In this project, we develop a network visualization tool that functions similarly to NAM but is able to render network environments in full 3D. It is able to take as input an NS-2 trace file (the same file taken as input by NAM), create the environment, position the sensor nodes, and animate the events of the simulation. Further, the visualization tool is easy to use and especially friendly to NAM users, as it is designed to follow interfaces and functions similar to NAM's. So far, the development has delivered the basic functionality; future work includes fully functional visualization capabilities and much-improved user interfaces.
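A sketch of the kind of record parsing such a tool needs is shown below; the whitespace-separated layout (event, time, node id, x, y, z) is an assumed simplified format for illustration, not the exact NS-2 trace syntax:

```python
# Parse one simplified trace record into a dict a renderer could use.
# The field layout here is assumed; real NS-2 traces carry more fields.

def parse_trace_line(line):
    """'<event> <time> <node> <x> <y> <z>' -> structured record."""
    event, t, node, x, y, z = line.split()
    return {
        "event": event,                         # e.g. send/receive marker
        "time": float(t),                       # simulation timestamp
        "node": int(node),                      # sensor node id
        "pos": (float(x), float(y), float(z)),  # 3D position, incl. depth
    }

record = parse_trace_line("s 0.5 3 10.0 20.0 -30.0")
```

Keeping depth as a third coordinate is precisely the extension that separates the 3D visualizer from NAM's two-dimensional model.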