15 results for Discrete-event simulation

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

This thesis presents an experimental and numerical study, based on the discrete element method (DEM), of bell-less charging in the blast furnace. The numerical models are based on the microscopic interactions between the particles in the blast furnace charging process. The emphasis is put on model validation, on investigating several phenomena in the charging process, and on finding factors that influence the results. The study considers and simulates size segregation in the hopper discharging process and particle flow and behavior on the chute, which is the key equipment in the charging system, using mono-size spheres, multi-size spheres and non-spherical particles. The behavior of the particles at the burden surface and pellet percolation into a coke layer are also studied. Small-scale experiments are used to validate the DEM models.
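
As an illustration of the microscopic particle interaction that DEM models of this kind are built on, the following is a minimal sketch of a linear spring-dashpot normal contact force between two spherical particles; the stiffness, damping and particle sizes are invented placeholders, not parameters from the thesis.

```python
import numpy as np

def contact_force(x1, x2, v1, v2, r1, r2, k=1e4, c=5.0):
    """Linear spring-dashpot normal contact force acting on particle 1.

    x1, x2 : particle centre positions (3-vectors)
    v1, v2 : particle velocities (3-vectors)
    r1, r2 : particle radii
    k, c   : normal stiffness and damping (illustrative values)
    """
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist          # > 0 only when the particles touch
    if overlap <= 0.0:
        return np.zeros(3)
    n = d / dist                        # unit normal from particle 1 towards particle 2
    v_rel_n = np.dot(v1 - v2, n)        # relative normal velocity (positive when approaching)
    # Repulsive spring force plus viscous damping, pushing particle 1 away from particle 2
    return -(k * overlap + c * v_rel_n) * n

# Example: two pellets of 10 mm radius in slight overlap
f = contact_force(np.array([0.0, 0.0, 0.0]), np.array([0.019, 0.0, 0.0]),
                  np.array([0.1, 0.0, 0.0]), np.zeros(3), 0.01, 0.01)
print(f)
```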

Relevance:

90.00%

Publisher:

Abstract:

The subject of this research is forecasting the capacity needs of the Fenix information system developed by TietoEnator Oy. The goal of the work is to become familiar with the different subsystems of the Fenix system, to find a way to separate and model the load each subsystem places on the system, and to determine, on a preliminary level, which parameters affect the load generated by these subsystems. Part of this work is to study different simulation alternatives and to assess their suitability for modeling complex systems. Based on the collected information, a simulation model describing the load on the system's data warehouse is created. Using data obtained from the model and measurements from the production system, the model is refined to correspond ever more closely to the behavior of the real system. The model is used to examine, for example, the simulated system load and the behavior of queues. In the production system, changes in the behavior of different load sources are measured, for example in relation to the number of users and the time of day. The results of this work are intended to serve as a basis for later follow-up research, in which the parameterization of the subsystems is refined further, the model's ability to describe the real system is improved, and the scope of the model is extended.
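
As an illustration of the kind of discrete-event load model the abstract refers to, the sketch below runs a minimal single-server queue from an event list and reports the mean queue length; the arrival and service rates are invented placeholders, not measurements from the Fenix system.

```python
import heapq, random

random.seed(1)

def simulate(arrival_rate=2.0, service_rate=2.5, horizon=1000.0):
    """Single-server FIFO queue; returns the time-averaged queue length."""
    events = [(random.expovariate(arrival_rate), "arrival")]
    t, queue, busy = 0.0, 0, False
    area = 0.0                              # time-integral of the queue length
    while events:
        t_next, kind = heapq.heappop(events)
        if t_next > horizon:
            break
        area += queue * (t_next - t)        # queue was constant since the last event
        t = t_next
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if busy:
                queue += 1
            else:
                busy = True
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
        else:                               # departure
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                busy = False
    return area / t

print("mean queue length:", simulate())
```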

Relevance:

90.00%

Publisher:

Abstract:

Technological development brings more and more complex systems to the consumer markets. The time required for bringing a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power. Distributed simulation can be used to meet these demands, but it has problems of its own. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, communication protocols, partitioning of the problem, distribution of the problem, capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. Using this computing power for distributed simulation requires the simulation to adapt to a changing load situation: all or part of the simulation work must be removed from a workstation when the owner wishes to use the workstation again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
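
A minimal sketch of the threshold-based decision described above: when a workstation owner becomes active, the simulation partitions hosted there are handed off to the least-loaded idle machine. The `Workstation` class, the threshold value and the rebalancing rule are illustrative assumptions, not the actual Diworse mechanism.

```python
from dataclasses import dataclass, field

@dataclass
class Workstation:
    name: str
    external_load: float                    # load caused by the owner's own work (0..1)
    partitions: list = field(default_factory=list)

OWNER_ACTIVE = 0.25                         # illustrative threshold, not from Diworse

def rebalance(stations):
    """Move simulation partitions away from workstations whose owners are active."""
    idle = [w for w in stations if w.external_load < OWNER_ACTIVE]
    for w in stations:
        if w.external_load >= OWNER_ACTIVE and w.partitions and idle:
            target = min(idle, key=lambda s: len(s.partitions))
            while w.partitions:
                target.partitions.append(w.partitions.pop())
            print(f"moved work from {w.name} to {target.name}")

stations = [Workstation("ws1", 0.05, ["cell-A"]),
            Workstation("ws2", 0.60, ["cell-B", "cell-C"]),
            Workstation("ws3", 0.10, [])]
rebalance(stations)
```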

Relevance:

90.00%

Publisher:

Abstract:

Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating the simulations. The success of parallel simulation depends heavily on the combination of the simulation application, the algorithm and the simulation environment. In this thesis a conservative, parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. The thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. The novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. Multiple message simulation groups messages together, simulating several messages before releasing the newly created messages. If the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented. The performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of the simulation results is required. Distributed simulation is also analyzed in order to find out the effect of the different elements of the implemented simulation environment. This analysis is performed using critical path analysis, which allows a lower bound for the simulation time to be determined. In this thesis critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
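
The sketch below illustrates two of the ideas the abstract relies on: a conservative logical process may only simulate events up to the minimum time bound promised on its input channels, and an arriving null message makes older unprocessed null messages on the same channel redundant (null message cancellation). The class names and structure are illustrative and do not reproduce the Diworse implementation.

```python
from collections import deque

class InputChannel:
    """One input channel of a conservative logical process."""
    def __init__(self):
        self.queue = deque()      # (timestamp, payload); payload None marks a null message
        self.clock = 0.0          # lower bound promised by the sender of this channel

    def receive(self, timestamp, payload=None):
        if payload is None:
            # Null message cancellation: an older unprocessed null message
            # carries no extra information once a newer bound arrives.
            while self.queue and self.queue[-1][1] is None:
                self.queue.pop()
        self.queue.append((timestamp, payload))
        self.clock = max(self.clock, timestamp)

class LogicalProcess:
    def __init__(self, channels):
        self.channels = channels
        self.local_time = 0.0

    def safe_time(self):
        """Bound up to which events are safe, assuming non-decreasing timestamps per channel."""
        return min(ch.clock for ch in self.channels)

    def step(self):
        bound = self.safe_time()
        due = []
        for ch in self.channels:
            while ch.queue and ch.queue[0][0] <= bound:
                due.append(ch.queue.popleft())
        for t, payload in sorted(due, key=lambda e: e[0]):
            if payload is not None:
                self.local_time = t
                print(f"simulate event {payload!r} at t={t}")

# Two input channels: one carries a real event, the other only null messages.
a, b = InputChannel(), InputChannel()
a.receive(5.0, "call-setup")
b.receive(3.0)        # null message
b.receive(6.0)        # cancels the pending null message at t=3.0
lp = LogicalProcess([a, b])
lp.step()             # safe up to t=5.0, so the event at t=5.0 is processed
```

In the example, the null message at t=3.0 is cancelled by the later one at t=6.0, and the real event at t=5.0 becomes safe to simulate.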

Relevance:

90.00%

Publisher:

Abstract:

Transportation and warehousing are large and growing sectors in society, and their efficiency is of high importance. Transportation also has a large share of global carbon dioxide emissions, which are one of the leading causes of anthropogenic climate warming. Various countries have agreed to decrease their carbon emissions according to the Kyoto protocol. Transportation is the only sector where emissions have steadily increased since the 1990s, which highlights the importance of transportation efficiency. The efficiency of transportation and warehousing can be improved with the help of simulations, but models alone are not sufficient. This research concentrates on the use of simulations in decision support systems. Three main simulation approaches are used in logistics: discrete-event simulation, system dynamics, and agent-based modeling. However, individual simulation approaches have weaknesses of their own. Hybridization (combining two or more approaches) can improve the quality of the models, as it allows a different method to be used to overcome the weakness of another. It is important to choose the correct approach (or a combination of approaches) when modeling transportation and warehousing issues. If an inappropriate method is chosen (this can occur if the modeler is proficient in only one approach or the model specification is not conducted thoroughly), the simulation model will have an inaccurate structure, which in turn will lead to misleading results. This issue can escalate further, as the decision-maker may assume that the presented simulation model gives the most useful results available, even though the whole model can be based on a poorly chosen structure. This research argues that simulation-based decision support systems need to take various issues into account to make a functioning decision support system. The actual simulation model can be constructed using any approach (or a combination of approaches), it can be combined with different optimization modules, and there needs to be a proper interface between the model and the user. These issues are presented in a framework that simulation modelers can use when creating decision support systems. In order for decision-makers to fully benefit from the simulations, the user interface needs to clearly separate the model and the user, but at the same time the user needs to be able to run the appropriate scenarios in order to analyze the problems correctly. This study recommends that simulation modelers start to transfer their tacit knowledge into explicit knowledge. This would greatly benefit the whole simulation community and improve the quality of simulation-based decision support systems as well. More studies should also be conducted using hybrid models and integrating simulations with Geographical Information Systems.
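
A minimal sketch of the separation the framework argues for: the simulation model, an optional optimization module and the user-facing interface sit behind distinct boundaries, so the decision-maker runs scenarios without touching model internals. All class names, numbers and the toy model itself are hypothetical illustrations, not part of the thesis framework.

```python
import random

class SupplyChainModel:
    """Stand-in for any simulation approach (DES, system dynamics, ABM or a hybrid)."""
    def run(self, fleet_size, seed=0):
        random.seed(seed)
        served = sum(random.random() < min(1.0, fleet_size / 12) for _ in range(1000))
        return {"service_level": served / 1000, "fleet_cost": 50_000 * fleet_size}

class Optimizer:
    """Optional module that drives the model to search for a good configuration."""
    def best_fleet(self, model, candidates):
        return max(candidates,
                   key=lambda n: model.run(n)["service_level"] - n / 100)

class DecisionSupportUI:
    """The only layer the decision-maker interacts with."""
    def __init__(self):
        self.model, self.optimizer = SupplyChainModel(), Optimizer()

    def compare_scenarios(self, fleets):
        for n in fleets:
            r = self.model.run(n)
            print(f"fleet={n:2d}  service={r['service_level']:.2f}  cost={r['fleet_cost']}")
        print("suggested fleet size:", self.optimizer.best_fleet(self.model, fleets))

DecisionSupportUI().compare_scenarios([8, 10, 12, 14])
```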

Relevance:

90.00%

Publisher:

Abstract:

Combating climate change is one of the key tasks of humanity in the 21st century. One of the leading causes is carbon dioxide emissions from the use of fossil fuels. Renewable energy sources should be used instead of relying on oil, gas, and coal. In Finland a significant amount of energy is produced using wood, and the use of wood chips is expected to increase significantly in the future, by over 60%. The aim of this research is to improve understanding of the costs of wood chip supply chains. This is done by using simulation as the main research method. The simulation model uses both agent-based modelling and discrete-event simulation to imitate the wood chip supply chain. The thesis concentrates on the use of simulation-based decision support systems in strategic decision-making. The simulation model is part of a decision support system, which connects the simulation model to databases and also provides a graphical user interface for the decision-maker. The main analysis conducted with the decision support system compares a traditional supply chain to a supply chain using specialized containers. According to the analysis, the container supply chain achieves lower costs than the traditional supply chain. A container supply chain can also be scaled up more easily due to faster emptying operations. Initially the container operations would supply only part of the fuel needs of a power plant and would complement the current supply chain. The model can be expanded to include intermodal supply chains, since, due to increased future demand, there are not enough wood chips located close to current and future power plants.
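
A minimal sketch of the kind of hybrid set-up described above: truck agents cycle independently, while the shared unloading bay is advanced with a discrete-event list, and the two chain configurations are compared on cost per delivered tonne. All payloads, times and costs are invented placeholders, not figures from the thesis.

```python
import heapq

def run_chain(name, payload_tonnes, unload_minutes, cost_per_trip,
              n_trucks=3, cycle_minutes=180, horizon=7 * 24 * 60):
    """Discrete-event loop over truck agents delivering wood chips to a plant.

    Each truck is a simple agent that starts a new cycle as soon as it has
    been unloaded; the unloading bay is the shared discrete-event resource.
    """
    events = [(i * 15, i) for i in range(n_trucks)]   # staggered first arrivals
    heapq.heapify(events)
    bay_free_at, delivered, trips = 0.0, 0.0, 0
    while events:
        t, truck = heapq.heappop(events)
        if t > horizon:
            break
        start = max(t, bay_free_at)                   # wait if the bay is busy
        bay_free_at = start + unload_minutes
        delivered += payload_tonnes
        trips += 1
        heapq.heappush(events, (bay_free_at + cycle_minutes, truck))
    cost = trips * cost_per_trip / delivered
    print(f"{name:11s} delivered {delivered:7.0f} t at {cost:.2f} EUR/t")

run_chain("traditional", payload_tonnes=40, unload_minutes=45, cost_per_trip=400)
run_chain("container",   payload_tonnes=36, unload_minutes=15, cost_per_trip=380)
```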

Relevance:

80.00%

Publisher:

Abstract:

The objective of the work was to study the development of transport chain automation with the help of discrete-event simulation. The theoretical part of the work covers simulation theory at a general level, related concepts and, in particular, the phases of carrying out a simulation project. This part answers the following questions: What is meant by discrete-event simulation? What are the benefits and limitations of simulation? What are the application areas of simulation? What kinds of software are used for simulation? What phases does a simulation project include? The applied part examines how discrete-event simulation was utilized in the Veto-Ketju project and what results were achieved with the constructed simulation model. The Veto-Ketju project deals with improving the efficiency of the transport chain as well as warehouse and port handling through automation. The project is led by Pesmel Oy, which also supplies the automated material handling system, while port operations and logistics expertise is provided by EP-Logistics Oy, an engineering and consulting company belonging to the SysOpen group. UPM-Kymmene and M-real participate from the paper industry, Rauma Stevedoring acts as the port operator and VR Cargo as the transport operator. In the simulation study of the Veto-Ketju project, the operation of the planned automatic railway wagon unloading system was verified before its commissioning, the effects of different operating practices on port operations and on the amount of resources needed in the port were examined, and the required warehouse sizes were determined. The simulation model made it possible to clearly demonstrate the differences between operating alternatives. In this way, additional information was produced to support decision-making, for example regarding the location of the system and the possible construction of a new warehouse.
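
As an illustration of the kind of discrete-event model described above, the sketch below models an automatic wagon-unloading system feeding a buffer warehouse, using the open-source SimPy library, and reports the peak buffer level as a rough proxy for the required warehouse size. All timings and capacities are invented and are not taken from the Veto-Ketju project.

```python
import simpy

UNLOAD_MIN = 2          # minutes to unload one wagon (illustrative)
WAGONS_PER_TRAIN = 20
SHIP_CYCLE_MIN = 10     # minutes to move one wagon load onward to a ship (illustrative)

def train(env, warehouse, unloader, peak):
    """A train arrives and its wagons are unloaded one by one into the buffer warehouse."""
    for _ in range(WAGONS_PER_TRAIN):
        with unloader.request() as req:
            yield req
            yield env.timeout(UNLOAD_MIN)
        yield warehouse.put(1)
        peak[0] = max(peak[0], warehouse.level)

def ship_loading(env, warehouse):
    """The port side draws wagon loads from the warehouse at its own pace."""
    while True:
        yield warehouse.get(1)
        yield env.timeout(SHIP_CYCLE_MIN)

env = simpy.Environment()
warehouse = simpy.Container(env, capacity=1000, init=0)
unloader = simpy.Resource(env, capacity=1)
peak = [0]
env.process(train(env, warehouse, unloader, peak))
env.process(ship_loading(env, warehouse))
env.run(until=8 * 60)
print("peak buffer level:", peak[0], "wagon loads")
```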

Relevance:

80.00%

Publisher:

Abstract:

Railway freight wagons are ageing very rapidly; this applies to Russia, Finland, Sweden and Europe more broadly. In Russia and Europe a large number of wagons are in use that have already exceeded their recommended service life. They are nevertheless used for transports, as not enough new replacement wagons are available. The newest wagons are usually acquired by wagon leasing companies or new railway operators - this applies particularly to Russia, where wagon leasing has become an extremely popular alternative. Forecasts predict that the wagon shortage will grow at least until 2010. If the popularity of rail as a freight transport mode increases, the strengthening demand for wagons will continue for considerably longer. The situation of the European and Russian wagon fleet is also reflected in the problems of the engineering industry serving it - in general, European companies in the sector are weakly profitable and their turnover is hardly growing; Russian and Ukrainian companies have been in the same situation, although in very recent years the situation has improved for some of them. When the turnover, profit and shareholder value of companies in these regions are compared with their US competitors, it is evident that the latter perform considerably better and are also able to pay dividends to their owners. The purpose of the study was to develop a new type of freight wagon for traffic between Finland and Russia, and possibly also China. The wagon type should be able to serve multiple purposes, carrying both raw materials and containers, and balancing the load-weight imbalance between the transport modes. As the basis for the development work we used a database of over 1000 Russian wagon types, from which we selected the most suitable wagons for container transport using the Data Envelopment Analysis method (we examined about 40 wagon types more closely), leaving as little empty space as possible in the train while still being able to carry the chosen container load. Since Russian wagons rarely have load-capacity problems, the comparison can be made on the basis of the length and total weight of the freight train. After simulating a wagon type suitable for combined transports in a transport network found in practice (e.g. raw timber to Finland or China and containers back towards Russia), we found that a shorter wagon length yields a cost advantage, particularly in raw material transports, but also in the event that the number of border-crossing points decreases. A shorter wagon type is also more flexible with respect to different container lengths (the use of 40-foot containers has become more common in recent years). Finally, we propose a networked approach as the production method for the new wagon type, in which part of the wagon would be manufactured in Finland and part in Russia and/or Ukraine. The wagon type should be registered in Russia, since it can then be used in traffic between Finland and Russia, as well as, where applicable, between Russia and China.
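
A minimal sketch of the input-oriented CCR Data Envelopment Analysis model mentioned above, solved with SciPy's linear programming routine; the three wagon types and their input/output figures are invented for illustration and are not taken from the Russian wagon database used in the study.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical wagon types: inputs = (length in m, tare weight in t), output = (TEU capacity)
inputs  = np.array([[25.6, 27.0],
                    [19.6, 22.0],
                    [13.9, 20.5]])
outputs = np.array([[3.0],
                    [2.0],
                    [1.0]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of wagon type o (1.0 = on the efficient frontier)."""
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.zeros(1 + n)                          # decision variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0                                   # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                           # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append([-inputs[o, i]] + list(inputs[:, i]))
        b_ub.append(0.0)
    for r in range(s):                           # sum_j lambda_j * y_rj >= y_ro
        A_ub.append([0.0] + list(-outputs[:, r]))
        b_ub.append(-outputs[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n), method="highs")
    return res.fun

for o in range(len(inputs)):
    print(f"wagon type {o}: efficiency {ccr_efficiency(o):.3f}")
```

Running the sketch reports an efficiency of 1.0 for wagon types on the frontier and values below 1.0 for dominated ones.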

Relevance:

80.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of the system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support - the Rodin platform. Proof-based verification as well as the reliance on abstraction and decomposition adopted in Event-B provides the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and the cloud domain.
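
Since the abstract names discrete-event simulation in SimPy as one of the quantitative techniques, the following is a minimal SimPy sketch that estimates the availability of a component which occasionally fails and is reconfigured onto a spare; the failure and reconfiguration times are invented for illustration and the model is not one of the case studies from the thesis.

```python
import random
import simpy

random.seed(42)
MTTF = 1000.0      # mean time to failure, hours (illustrative)
RECONF = 2.0       # mean time to detect a failure and reconfigure onto a spare (illustrative)

def component(env, stats):
    """Alternate between operational periods and reconfiguration breaks."""
    while True:
        up = random.expovariate(1.0 / MTTF)
        yield env.timeout(up)
        stats["up"] += up
        down = random.expovariate(1.0 / RECONF)
        yield env.timeout(down)
        stats["down"] += down

stats = {"up": 0.0, "down": 0.0}
env = simpy.Environment()
env.process(component(env, stats))
env.run(until=10_000_000)
print("estimated availability:", stats["up"] / (stats["up"] + stats["down"]))
```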

Relevance:

30.00%

Publisher:

Abstract:

Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus dreaming is considered a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework, and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted to test this specific theory, put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what constitutes a 'threatening event' in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life? What are the effects of exposure to severe waking life threat on dreams? The results reveal that threatening events are relatively frequent in dreams, and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism, it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted to test the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems to have always resided in the periphery of science, never regarded as worth studying by the mainstream. Nevertheless, when brought into the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge gained from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs, that is, into the spotlight at the crossroads of various disciplines.

Relevance:

30.00%

Publisher:

Abstract:

The Cloud Computing paradigm is continually evolving, and with it, the size and the complexity of its infrastructure. Assessing the performance of a Cloud environment is an essential but strenuous task. Modeling and simulation tools have proved their usefulness and power in dealing with this issue. This master's thesis contributes to the development of the widely used cloud simulator CloudSim and proposes CloudSimDisk, a module for modeling and simulation of energy-aware storage in CloudSim. As a starting point, a review of Cloud simulators was conducted and hard disk drive technology was studied in detail. CloudSim was identified as the most popular and sophisticated discrete-event Cloud simulator, and CloudSimDisk was therefore developed as an extension of CloudSim v3.0.3. The source code has been published for the research community. The simulation results proved to be in accordance with the analytic models, and the scalability of the module has been demonstrated for further development.
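
CloudSimDisk itself is written in Java on top of CloudSim, but the kind of analytic disk model it is validated against can be illustrated in a few lines: request time is seek plus rotational latency plus transfer, and energy is the power of the corresponding state multiplied by time. The parameters below are generic HDD figures chosen for illustration, not the ones used in CloudSimDisk.

```python
class HddModel:
    """Toy energy-aware hard disk drive model (illustrative parameters)."""
    def __init__(self, seek_s=0.0089, rpm=7200, transfer_mb_s=210,
                 active_w=11.1, idle_w=4.6):
        self.seek_s = seek_s
        self.rotation_s = 0.5 * 60.0 / rpm       # average rotational latency
        self.transfer_mb_s = transfer_mb_s
        self.active_w = active_w
        self.idle_w = idle_w

    def request(self, size_mb):
        """Return (service time in s, energy in J) for one read/write request."""
        t = self.seek_s + self.rotation_s + size_mb / self.transfer_mb_s
        return t, t * self.active_w

    def idle(self, seconds):
        """Energy consumed while the disk sits idle."""
        return seconds * self.idle_w

hdd = HddModel()
t, e_active = hdd.request(size_mb=64)
print(f"64 MB request: {t * 1000:.1f} ms, {e_active:.2f} J active, "
      f"{hdd.idle(1.0):.2f} J per idle second")
```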

Relevance:

30.00%

Publisher:

Abstract:

The objective of the work is to study the flow behavior and to support the design of an air cleaner by dynamic simulation. In the paper printing industry, it is necessary to monitor the quality of paper while the paper is being produced. During production, the quality of the paper can be monitored by a camera. It is therefore necessary to keep the camera lens clean, as wood particles may fall from the paper and settle on the lens. In this work, the behavior of the air flow and its effect on the particles at different inlet angles are simulated. Geometries with different inlet angles for the single-channel and double-channel cases were constructed using ANSYS CFD software, and all simulations were performed in ANSYS Fluent. The simulation results for the single-channel and double-channel cases revealed significant differences in the behavior of the flow and in the particle velocity. The main conclusions from this work are the following. 1) For the single-channel case the best angle was 0 degrees, because in that case the air flow can keep 60% of the particles away from the lens that would otherwise settle on it. 2) For the double-channel case, the best solution was found when the angle of the first inlet was 0 degrees and the angle of the second inlet was 45 degrees. In that case, the air flow can keep 91% of the particles away from the lens that would otherwise settle on it.

Relevance:

20.00%

Publisher:

Abstract:

Abstract: Comparison of thinning methods in a pine stand on drained peatland. A simulation study.