29 results for event-driven simulation

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials behave in unusual ways compared to normal solids or fluids. Although some of their characteristics are still not well understood, one thing is confirmed: particle-particle interaction plays a key role in the dynamics of granular materials, especially dense granular materials. In the first part of this thesis, the development of two models describing this interaction is presented in detail, based on the results of finite-element simulation, dimensional analysis and numerical simulation. The first model describes the normal collision of viscoelastic particles; it extends existing models with additional parameters, which makes it predict the experimental results more accurately. The second model covers oblique collision and includes the effects of tangential velocity, angular velocity and surface friction based on Coulomb's law. The theoretical predictions of this model agree with those of finite-element simulation. In the latter chapters of the thesis, the models are used to predict industrial granular flows, and the agreement between simulations and experiments further demonstrates the validity of the new models. The first case is the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how particle characteristics, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the previous models, the simulation reproduces experimentally observed phenomena, such as a depression forming in the centre at high rotation frequencies. The third application is gas-solid mixed flow in a vertically vibrated device. Gas-phase motion is added and coupled with the particle motion: the governing equations of the gas phase are solved using large eddy simulation (LES), and the particle motion is predicted using a Lagrangian method. The simulation reproduces pattern formations reported in experiments.
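
As a rough illustration of the kind of collision rule described above (not the thesis's viscoelastic model), the sketch below resolves a single binary collision with a normal coefficient of restitution and a Coulomb friction cap on the tangential impulse; particle spin is omitted and the parameter values are placeholders.

```python
import numpy as np

def collide(x1, x2, v1, v2, m1, m2, e_n=0.9, mu=0.3):
    """Resolve one binary collision: normal restitution e_n scales the normal
    relative velocity, Coulomb friction mu caps the tangential impulse."""
    n = (x2 - x1) / np.linalg.norm(x2 - x1)        # unit normal from particle 1 to 2
    g = v1 - v2                                    # relative velocity of 1 w.r.t. 2
    g_n = np.dot(g, n)
    if g_n <= 0.0:                                 # particles already separating
        return v1, v2
    m_eff = m1 * m2 / (m1 + m2)
    j_n = -(1.0 + e_n) * m_eff * g_n               # normal impulse on particle 1
    g_t = g - g_n * n                              # tangential relative velocity
    g_t_norm = np.linalg.norm(g_t)
    if g_t_norm > 1e-12:
        t = g_t / g_t_norm
        # Coulomb's law: tangential impulse cannot exceed mu * |normal impulse|
        j_t = -min(mu * abs(j_n), m_eff * g_t_norm)
    else:
        t, j_t = np.zeros_like(n), 0.0
    impulse = j_n * n + j_t * t
    return v1 + impulse / m1, v2 - impulse / m2

# Example: equal spheres approaching with a small tangential component.
v1_new, v2_new = collide(np.array([0.0, 0.0]), np.array([1.0, 0.0]),
                         np.array([1.0, 0.2]), np.array([-1.0, 0.0]),
                         m1=1.0, m2=1.0)
print(v1_new, v2_new)
```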

Relevance:

90.00%

Publisher:

Abstract:

The aim of this work was to study the kinematics of granular material and to build an experimental apparatus for studying shear flows of granular material. The experimental part concentrates on internal force fluctuations and on understanding them. The theoretical part reviews the general properties of granular materials and presents two different ways of modelling the fluctuations of physical properties in granular material. These two modelling methods are the scalar q-model and simulation. The scalar q-model determines the stress acting on each individual grain when the grain is part of a two- or three-dimensional packing. The basic idea of the model is to describe the inhomogeneity of the stresses caused by the random arrangement of the grains. The simulation modelling is based on an event-driven algorithm in which the dynamics of the system is described through individual particle collisions. The stages of a collision were resolved using the momentum equations and the definition of restitution. The theoretical part also briefly reviews the causes of stress fluctuations and of jamming in granular material. The experimental apparatus was used to study the behaviour of granular material in an annular shear flow. The main goal of the experimental part was to measure, at the bottom of the annular volume, the instantaneous force fluctuations caused by particle contacts and collisions. Steel balls were used as the granular material. The stress signal as a function of time shows large fluctuations, which can be as much as an order of magnitude greater than the mean. Such large-amplitude fluctuations are a significant drawback for the continuum models commonly applied to granular materials and render those models invalid. At a general level, the probability distribution of the stresses is consistent with the results of the scalar q-model: in both cases the distribution has an exponential form.
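
For illustration, here is a minimal sketch of the scalar q-model idea described above: each grain splits its load between the grains below it with a random fraction, and the force distribution at the bottom develops an exponential tail. The lattice size, periodic boundaries and uniform q-distribution are assumptions, not the exact model used in the thesis.

```python
import numpy as np

def q_model(rows=200, cols=200, rng=None):
    """Minimal 2-D scalar q-model: each grain splits its load (own weight plus
    load from above) between its two neighbours below; the split fraction q is
    drawn uniformly on [0, 1]."""
    rng = rng or np.random.default_rng(0)
    load = np.ones(cols)                      # top row carries only its own weight
    for _ in range(rows - 1):
        q = rng.random(cols)
        left = q * load
        right = (1.0 - q) * load
        # periodic boundaries keep the lattice simple
        load = 1.0 + np.roll(left, 1) + right
    return load                                # forces on the bottom row

forces = q_model()
# The normalised force distribution should develop an exponential tail,
# consistent with the measured stress fluctuations described above.
print(forces.mean(), forces.std())
```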

Relevance:

90.00%

Publisher:

Abstract:

B2B document handling is moving very rapidly from paper to electronic networks and the electronic domain. Moving, handling and transforming large electronic business documents places heavy demands on the systems that handle them. This paper explores new technologies such as SOA, event-driven systems and the ESB, and a scalable, event-driven enterprise service bus is created to demonstrate these new approaches to message handling. As an end result, we have a small but fully functional messaging system with several different components. This is the first larger Java project done in-house, so along the way we also developed our own set of best practices for Java development, covering configuration setup, tools, code repositories, class naming and much more.
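
To make the event-driven messaging idea concrete, here is a minimal publish/subscribe sketch in Python (the system described above was built in Java on an ESB); the class, topic name and message format are invented for illustration.

```python
import queue
import threading
from collections import defaultdict

class MiniBus:
    """Toy event-driven message bus: producers publish documents to topics and
    subscribers consume them asynchronously via a dispatcher thread."""

    def __init__(self):
        self._subscribers = defaultdict(list)
        self._queue = queue.Queue()
        threading.Thread(target=self._dispatch, daemon=True).start()

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        self._queue.put((topic, message))       # non-blocking for the producer

    def wait(self):
        self._queue.join()                      # block until all published messages are handled

    def _dispatch(self):
        while True:
            topic, message = self._queue.get()
            for handler in self._subscribers[topic]:
                handler(message)                # route / transform the document
            self._queue.task_done()

bus = MiniBus()
bus.subscribe("invoice", lambda doc: print("received", doc))
bus.publish("invoice", {"id": 1, "format": "XML"})
bus.wait()
```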

Relevance:

80.00%

Publisher:

Abstract:

In this thesis, concurrent communication event handling is implemented using a thread pool approach. Concurrent events are handled with the Reactor design pattern, and multithreading is implemented using the Leader/Followers design pattern. The main focus is to evaluate the behaviour of the implemented model with different numbers of concurrent connections and different numbers of threads. Furthermore, the feasibility of the model in the PeerHood middleware is evaluated. The implemented model is evaluated with a purpose-built test environment that enables concurrent message sending from multiple connections to the system under test. Message round-trip times are measured in the tester application. In the evaluation, a processing delay is injected into the system and the influence of the delay on the average round-trip time is analysed.
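
A simplified sketch of the Leader/Followers idea described above, written in Python rather than the thesis's implementation: worker threads take turns holding a "leader" lock to wait for I/O events, deactivate the ready handles, and then process them while another thread becomes leader. The port number and pool size are arbitrary, and this is not the PeerHood code.

```python
import selectors
import socket
import threading

sel = selectors.DefaultSelector()
leader_lock = threading.Lock()        # exactly one thread (the leader) waits for I/O

def worker():
    """Leader/Followers-style worker: the leader waits for events, deactivates
    the ready handles, hands leadership over, then processes the events."""
    while True:
        with leader_lock:
            ready = []
            for key, _ in sel.select(timeout=1.0):
                sel.unregister(key.fileobj)      # so the next leader cannot pick it up too
                ready.append((key.fileobj, key.data))
        for sock, handler in ready:
            handler(sock)

def handle_client(conn):
    data = conn.recv(1024)
    if data:
        conn.sendall(data)                       # echo back; a tester measures the round trip
        sel.register(conn, selectors.EVENT_READ, handle_client)
    else:
        conn.close()

def accept(server):
    conn, _ = server.accept()
    sel.register(conn, selectors.EVENT_READ, handle_client)
    sel.register(server, selectors.EVENT_READ, accept)   # reactivate the listening socket

server = socket.socket()
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 9000))
server.listen()
sel.register(server, selectors.EVENT_READ, accept)

for _ in range(4):                               # thread pool of four workers
    threading.Thread(target=worker, daemon=True).start()
threading.Event().wait()                         # keep the process alive
```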

Relevance:

80.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case-study model in parallel, using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
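
The guarded-command execution model described above can be pictured as repeatedly executing a nondeterministically chosen action whose guard holds, until no guard is enabled. Below is a minimal sketch of that reading, using Dijkstra's classic greatest-common-divisor example; the representation of guards and bodies is an illustrative choice, not Action Systems or Event-B notation.

```python
import random

def run_action_system(state, actions, rng=random.Random(0)):
    """Repeatedly execute a nondeterministically chosen enabled action
    (an action whose guard holds) until no guard is enabled."""
    while True:
        enabled = [(guard, body) for guard, body in actions if guard(state)]
        if not enabled:
            return state                      # termination: all guards are false
        _, body = rng.choice(enabled)
        body(state)

# Dijkstra's classic example: gcd(x, y) as two guarded commands.
state = {"x": 36, "y": 60}
actions = [
    (lambda s: s["x"] > s["y"], lambda s: s.__setitem__("x", s["x"] - s["y"])),
    (lambda s: s["y"] > s["x"], lambda s: s.__setitem__("y", s["y"] - s["x"])),
]
print(run_action_system(state, actions))      # both variables end at gcd(36, 60) = 12
```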

Relevance:

80.00%

Publisher:

Abstract:

The theoretical research of the study focused on business process management and business process modeling; the goal was to find a new business process modeling method for an electrical accessories manufacturing enterprise. The aim was to identify a few candidate modeling methods from which the company could choose the one best suited to its needs. The study was carried out as qualitative research, with an action study and a case study as the most important ways to collect data. The empirical part presents examples of the company's processes modeled with the new method, as well as the process modeling process itself. The new way of modeling processes improves especially the visual presentation of the processes and the understanding of how employees should work at the organizational interfaces of a process and at the interfaces between different processes. The result of the study is a new, unified way to model the company's processes, which makes the process models easier to understand and create. This improved readability makes it possible to reduce the costs caused by the unclear old process models.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of the thesis is to examine the long-term performance persistence and the relative performance of hedge funds during bear and bull market periods. The performance metrics applied for fund rankings are raw return, Sharpe ratio, mean-variance ratio and the strategy distinctiveness index, calculated from the original and clustered data, respectively. Four different length combinations of selection and holding periods are employed. Persistence is examined using a decile and quartile portfolio formation approach, with the Sharpe ratio and SKASR as performance metrics. Relative performance persistence is examined by comparing hedge portfolio returns under varying stock market conditions. The data is gathered from a private database covering 10,789 hedge funds, and the time horizon is set from January 1990 to December 2012. The results of this thesis suggest that long-term performance persistence of hedge funds exists. The degree of persistence also depends on the performance metrics employed and on the length combination of the selection and holding periods. The best performance persistence was obtained in the decile portfolio analysis based on Sharpe ratio rankings, for the combination of a 12-month selection period and a holding period of equal length. The results also suggest that the best performance persistence occurs in the Event Driven and Multi strategies. A dummy regression analysis shows that a relationship between hedge fund and stock market returns exists. Based on the results, the Dedicated Short Bias, Global Macro, Managed Futures and Other strategies perform well during bear market periods. The results also indicate that the Market Neutral strategy is not absolutely market neutral and that the Event Driven strategy has the best performance among all hedge strategies.
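
As an illustration of the ranking procedure described above, the sketch below computes annualised Sharpe ratios over a selection window, forms decile portfolios and reports each decile's Sharpe ratio over the holding window. The column layout, window boundaries and synthetic data are assumptions; the thesis's private database and the SKASR metric are not reproduced here.

```python
import numpy as np
import pandas as pd

def sharpe(returns, rf=0.0, periods_per_year=12):
    """Annualised Sharpe ratio from periodic (e.g. monthly) returns."""
    excess = returns - rf
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

def decile_persistence(returns, selection, holding):
    """Rank funds into deciles by Sharpe ratio over the selection window and
    report each decile's mean Sharpe ratio over the holding window.
    `returns` is a DataFrame of monthly fund returns (funds as columns)."""
    sel = returns.loc[selection[0]:selection[1]]
    hold = returns.loc[holding[0]:holding[1]]
    rank = sel.apply(sharpe)
    deciles = pd.qcut(rank, 10, labels=False)          # 0 = worst, 9 = best
    hold_sharpe = hold.apply(sharpe)
    return hold_sharpe.groupby(deciles).mean()

# Synthetic example: 100 funds, 24 months of returns.
rng = np.random.default_rng(1)
idx = pd.date_range("2011-01-31", periods=24, freq="M")
data = pd.DataFrame(rng.normal(0.005, 0.03, (24, 100)), index=idx)
print(decile_persistence(data, ("2011-01-31", "2011-12-31"),
                               ("2012-01-31", "2012-12-31")))
```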

Relevance:

80.00%

Publisher:

Abstract:

The theoretical part of the thesis examined process redesign, process modelling and the construction of process metrics. The goal of the thesis was to redesign the organisation's certification process. To achieve this goal, the current and the new process had to be modelled, and a set of metrics had to be built that would give the organisation valuable information on how efficiently the new process performs. The work was carried out as participatory action research. The author had worked in the target organisation for several years and could therefore draw on this knowledge both in modelling the current process and in designing the new one. The result of the thesis is a new certification process that is leaner and more efficient than its predecessor. A new metrics system was built with which the organisation's management can monitor the efficiency of the process stakeholders and the development of product quality. As a by-product, the organisation obtained detailed process descriptions that can be used as training material when recruiting new personnel and as an informative tool when presenting the process to official certification bodies.

Relevance:

30.00%

Publisher:

Abstract:

The subject of this research is forecasting the capacity requirements of the Fenix information system developed by TietoEnator Oy. The goal is to become familiar with the different subsystems of Fenix, to find a way to separate and model the effect of each subsystem on the system load, and to determine, on a preliminary level, which parameters affect the load generated by those subsystems. Part of the work is to examine different simulation alternatives and to assess their suitability for modelling complex systems. Based on the collected data, a simulation model describing the load on the system's data warehouse is created. Using the information obtained from the model and measurements from the production system, the model is refined to correspond ever more closely to the behaviour of the real system. From the model, for example, the simulated system load and the behaviour of the queues are examined. From the production system, changes in the behaviour of different load sources are measured, for example as a function of the number of users and the time of day. The results of this work are intended to serve as a basis for later follow-up research in which the parameterisation of the subsystems is refined further, the model's ability to describe the real system is improved, and the scope of the model is extended.
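
As a generic illustration of the kind of load-and-queue simulation discussed above (not the Fenix capacity model itself), here is a minimal event-driven simulation of a single-server queue; the arrival and service rates are invented.

```python
import heapq
import random

def mm1(arrival_rate, service_rate, horizon, seed=0):
    """Minimal event-driven simulation of an M/M/1 queue: arrival and
    departure events are processed in timestamp order and the time-average
    number of requests in the system is estimated."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    in_system, area, last_t = 0, 0.0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        area += in_system * (t - last_t)    # accumulate time-weighted queue length
        last_t = t
        if kind == "arrival":
            in_system += 1
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            if in_system == 1:              # server was idle, start service now
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
        else:
            in_system -= 1
            if in_system > 0:               # start serving the next request
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
    return area / last_t                    # mean number of requests in the system

print("mean jobs in system:", mm1(0.8, 1.0, 100_000))   # theory gives 0.8/(1-0.8) = 4
```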

Relevance:

30.00%

Publisher:

Abstract:

In this master's thesis a mechanical model driven by a variable-speed synchronous machine was developed. The mechanical model simulates the mechanics of the power transmission and its torsional vibrations, and it was developed for the needs of the branched mechanics of a rolling mill and the propulsion system of a tanker. First, the scope of the thesis was to clarify the concepts connected to the mechanical model: the variable-speed drive, the mechanics of power transmission and the vibrations in the power transmission. Next, the existing mechanical model, a straight shaft line with twelve moments of inertia, was extended to branched configurations covering the cases of parallel machines and parallel rolls. Additionally, the model was expanded, for more accurate simulation, to up to thirty moments of inertia. The model was also enhanced to enable simulation of a three-phase short-circuit situation of the simulated machine. The mechanical model was then validated by comparing the results of the developed simulation tool to those of other simulation tools. The compared results are the natural frequencies and mode shapes of torsional vibration, the response to a load-torque step, and the stress in the mechanical system caused by the disturbance of the magnetic field arising from the three-phase short circuit. The comparisons agreed well, and the mechanical model was validated for the compared cases. Further development would be to make the load torque time-dependent and to include two frequency converters and two FEM-modelled machines simulated in parallel.
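
The natural frequencies and mode shapes mentioned above come from a generalised eigenvalue problem of the inertia and stiffness matrices. A minimal sketch for a straight torsional chain is shown below; the inertia and stiffness values are invented, and the branched cases of the thesis are not covered.

```python
import numpy as np
from scipy.linalg import eigh

def torsional_modes(inertias, stiffnesses):
    """Natural frequencies (Hz) and mode shapes of a straight torsional chain:
    n lumped inertias [kg m^2] connected by n-1 shaft stiffnesses [Nm/rad].
    Solves the generalised eigenproblem K * phi = w^2 * J * phi."""
    n = len(inertias)
    J = np.diag(inertias)
    K = np.zeros((n, n))
    for i, k in enumerate(stiffnesses):       # assemble the stiffness matrix
        K[i:i+2, i:i+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    w2, modes = eigh(K, J)                    # symmetric generalised eigenproblem
    freqs = np.sqrt(np.clip(w2, 0.0, None)) / (2.0 * np.pi)
    return freqs, modes

# Toy three-inertia example (motor - coupling - roll); values are invented.
freqs, modes = torsional_modes([2.0, 0.5, 3.0], [1.0e5, 5.0e4])
print(freqs)            # the first entry (~0 Hz) is the rigid-body rotation mode
```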

Relevance:

30.00%

Publisher:

Abstract:

Active magnetic bearings are a new technology with many advantages compared to traditional bearing designs. Active magnetic bearings, however, require retainer bearings in order to prevent damage in the event of a component, power or control-loop failure. In the drop-down situation, when the rotor drops from the magnetic bearings onto the retainer bearings, the design parameters of the retainer bearings have a significant influence on the behaviour of the rotor. In this study, the dynamics of an electric motor supported by active magnetic bearings is studied during rotor drop onto the retainer bearings, using a multibody simulation approach. Various design parameters of the retainer bearings are studied with a simulation model, and the results are compared with those found in the literature. The retainer bearings are modelled using a detailed ball-bearing model, which accounts for damping and stiffness properties, the oil film and the friction between the races and the rolling elements, and which includes an inertia description of the rolling elements. The model of the magnetic bearing system contains the unbalances of the rotor and the stiffness and damping properties of the support. In this study, a computationally efficient contact model between the rotor and the retainer bearings is proposed. In addition, this work provides information for the design of a physical prototype and its retainer bearings.
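
For illustration, here is a toy one-degree-of-freedom rotor-drop sketch with a Hertzian-type spring-damper contact force of the kind commonly used in multibody models; it is not the contact model proposed in the thesis, and all parameter values are invented.

```python
def contact_force(pen, pen_rate, k=1.0e9, c=3.0e3, n=1.5):
    """Hertzian-type normal contact force: a nonlinear spring plus viscous
    damping, active only while the rotor penetrates the retainer bearing."""
    if pen <= 0.0:
        return 0.0
    return max(0.0, k * pen**n + c * pen_rate)     # never pull the surfaces together

def rotor_drop(mass=50.0, gap=0.3e-3, dt=1.0e-6, t_end=0.05):
    """1-DOF vertical drop of a rigid rotor through the air gap onto a retainer
    bearing, integrated with semi-implicit Euler. Downward displacement is
    positive; all parameter values are invented placeholders."""
    y, v, history = 0.0, 0.0, []
    for step in range(int(t_end / dt)):
        pen = y - gap                              # positive once the air gap is closed
        force = mass * 9.81 - contact_force(pen, v)
        v += force / mass * dt
        y += v * dt
        history.append((step * dt, y))
    return history

gap = 0.3e-3
traj = rotor_drop(gap=gap)
print("max penetration [mm]:", (max(y for _, y in traj) - gap) * 1e3)
```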

Relevance:

30.00%

Publisher:

Abstract:

The ultimate goal of any research in the mechanism/kinematic/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology in order to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As a part of the systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions are obtained by adopting closed-form classical or modern algebraic solution methods, or numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limitations on the number of design specifications and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of the approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints. Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables, but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric (in the mathematical sense that all parameter values are considered, including the degenerate cases, for which the system is solvable) algebraic systems of n equations in at least n+1 variables. Adopting the developed solution method for solving the dyadic equations in direct polynomial form with two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be solved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at the system level demands integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated into mechanical system simulation techniques. The developed kinematic design method is based on combinations of the two-precision-point formulation and on optimisation (with mathematical programming techniques or by adopting optimisation methods based on probability and statistics) of substructures, using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with the mechanical system simulation techniques.

Relevance:

30.00%

Publisher:

Abstract:

Technological development brings more and more complex systems to the consumer markets. The time required to bring a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power, and distributed simulation can be used to meet these demands. Distributed simulation, however, has its problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, the communication protocols, the partitioning of the problem, the distribution of the problem, the capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. Using this computing power for distributed simulation requires the simulation to adapt to a changing load situation: all or part of the simulation work must be removed from a workstation when its owner wishes to use the workstation again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
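
A toy sketch of the load-balancing decision discussed above: partitions are migrated away from workstations whose external (owner) load exceeds a threshold. The data structures, threshold and migration policy are invented for illustration and do not reflect the Diworse implementation.

```python
def rebalance(assignment, external_load, threshold=0.25):
    """Toy load-balancing step: move simulation partitions away from
    workstations whose external (owner) load exceeds `threshold`, onto the
    currently least-loaded workstation. `assignment` maps partition -> host."""
    idle_hosts = [h for h, load in external_load.items() if load <= threshold]
    if not idle_hosts:
        return assignment                       # nowhere to migrate to
    new_assignment = {}
    for partition, host in assignment.items():
        if external_load[host] > threshold:
            target = min(idle_hosts, key=lambda h: external_load[h])
            new_assignment[partition] = target  # migrate the partition
        else:
            new_assignment[partition] = host
    return new_assignment

# Example: the owner of ws2 starts working, so its partitions migrate away.
assignment = {"cell-A": "ws1", "cell-B": "ws2", "cell-C": "ws2"}
load = {"ws1": 0.05, "ws2": 0.90, "ws3": 0.10}
print(rebalance(assignment, load))
```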

Relevance:

30.00%

Publisher:

Abstract:

Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus dreaming is considered a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted on this specific theory, put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what counts as a 'threatening event' in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life? What are the effects of exposure to severe waking-life threat on dreams? The results reveal that threatening events are relatively frequent in dreams and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism, it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted on the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems to have always resided in the periphery of science, never regarded as worth studying by the mainstream. Nevertheless, when brought into the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge learned from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs, that is, into the spotlight at the crossroads of various disciplines.

Relevance:

30.00%

Publisher:

Abstract:

Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation times tend to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating the simulations. The success of parallel simulation depends heavily on the combination of the simulation application, the algorithm and the simulation environment. In this thesis a conservative, parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. The thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. The novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. Multiple message simulation groups messages together by simulating several messages before releasing the newly created messages; if the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented. Performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, in which the properties of the application exclusively determine which connections are necessary when a certain accuracy of the simulation results is required. The distributed simulation is also analyzed in order to find out the effect of the different elements in the implemented simulation environment. This analysis is performed using critical path analysis, which allows the determination of a lower bound for the simulation time. In this thesis critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
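
A minimal sketch of the null-message idea behind the Chandy-Misra algorithm described above: a logical process may only consume events up to the minimum clock of its input channels, and it sends null messages carrying its clock plus lookahead so that its neighbours can advance. The class layout and the two-process example are illustrative, not the Diworse implementation; null message cancellation and multiple message simulation are not shown.

```python
import heapq

class LogicalProcess:
    """Minimal conservative (Chandy-Misra style) logical process: it may only
    consume events up to the minimum clock value of its input channels, and it
    keeps its neighbours unblocked by sending null messages."""

    def __init__(self, name, lookahead):
        self.name = name
        self.lookahead = lookahead            # promise: no output earlier than clock + lookahead
        self.clock = 0.0
        self.pending = []                     # future-event list of (timestamp, payload)
        self.channel_clocks = {}              # latest timestamp seen on each input channel

    def receive(self, channel, timestamp, payload=None):
        self.channel_clocks[channel] = timestamp
        if payload is not None:               # a null message only advances the channel clock
            heapq.heappush(self.pending, (timestamp, payload))

    def step(self, neighbours):
        safe = min(self.channel_clocks.values()) if self.channel_clocks else self.clock
        while self.pending and self.pending[0][0] <= safe:
            t, _payload = heapq.heappop(self.pending)
            self.clock = t                    # ... process the event here ...
        self.clock = max(self.clock, safe)
        for n in neighbours:                  # null message carries the lower time bound
            n.receive(self.name, self.clock + self.lookahead)

lp_a, lp_b = LogicalProcess("A", 0.5), LogicalProcess("B", 0.5)
lp_a.receive("B", 0.0)
lp_b.receive("A", 0.0)
for _ in range(3):                            # null messages alone keep both clocks advancing
    lp_a.step([lp_b])
    lp_b.step([lp_a])
print(lp_a.clock, lp_b.clock)
```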