40 results for systems-based simulation
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Virtual Testing of Active Magnetic Bearing Systems based on Design Guidelines given by the Standards
Abstract:
Active magnetic bearings offer many advantages that have brought new applications to industry. However, like all new technology, active magnetic bearings also have downsides, and one of these is the low level of standardization. This thesis focuses mainly on the ISO 14839 standard and, more specifically, on its system verification methods. These verification methods are applied in a practical test with an existing active magnetic bearing system. The system is simulated in Matlab using a rotor-bearing dynamics toolbox, but this study does not include the exact simulation code or a direct algebraic calculation. However, this study demonstrates that standardized simulation methods can be applied to practical problems.
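The verification criterion in question can be illustrated with a short sketch. ISO 14839-3 grades an AMB system by the peak of the closed-loop sensitivity function |S(jw)| = |1/(1 + L(jw))|; the zone limits below (A < 3, B < 4, C < 5) follow the standard, while the single-axis PD-controlled loop transfer function and all its coefficients are invented for illustration and are not from the thesis:

```python
import math

# Hypothetical single-axis AMB loop for illustration (not from the thesis):
# a PD controller acting on a rigid rotor with negative bearing stiffness,
# L(s) = (kp + kd*s) / (m*s**2 - k).
def open_loop(s, m=1.0, k=4.0e4, kp=1.2e5, kd=300.0):
    return (kp + kd * s) / (m * s**2 - k)

def sensitivity_peak(freqs_hz):
    """Peak of |S(jw)| = |1 / (1 + L(jw))| over a frequency grid."""
    peak = 0.0
    for f in freqs_hz:
        s = 1j * 2.0 * math.pi * f
        peak = max(peak, abs(1.0 / (1.0 + open_loop(s))))
    return peak

def iso14839_zone(peak):
    # Zone limits of ISO 14839-3: A < 3, B < 4, C < 5, otherwise D.
    for zone, limit in (("A", 3.0), ("B", 4.0), ("C", 5.0)):
        if peak < limit:
            return zone
    return "D"

freqs = range(1, 2001)                  # 1 Hz .. 2 kHz evaluation grid
zone = iso14839_zone(sensitivity_peak(freqs))
```

A newly commissioned machine is expected to sit in zone A; a measured peak drifting toward zone C would indicate deteriorating stability margins.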
Abstract:
Pumping processes requiring a wide range of flow rates are often equipped with parallel-connected centrifugal pumps. In parallel pumping systems, variable speed control allows the required process output to be delivered with a varying number of operated pump units and selected rotational speed references. However, optimizing parallel-connected, rotational-speed-controlled pump units often requires adaptive modelling of both the parallel pump characteristics and the surrounding system under varying operating conditions. The information available for system modelling in typical parallel pumping applications, such as waste water treatment and various cooling and water delivery tasks, can be limited, and the lack of real-time operating point monitoring often limits accurate energy efficiency optimization. Hence, easily implementable control strategies that can be adopted with minimal system data are needed. This doctoral thesis concentrates on methods that allow the energy-efficient use of variable-speed-controlled parallel pumps in systems where each parallel pump unit consists of a centrifugal pump, an electric motor, and a frequency converter. Firstly, the operating conditions suitable for variable-speed-controlled parallel pumps are studied. Secondly, methods for determining the output of each parallel pump unit using characteristic-curve-based operating point estimation with a frequency converter are discussed. Thirdly, the implementation of a control strategy based on real-time pump operating point estimation and sub-optimization of each parallel pump unit is studied. The findings of the thesis support the idea that the energy efficiency of pumping can be increased without installing new, more efficient components in the systems, simply by adopting suitable control strategies.
An easily implementable and adaptive control strategy for variable-speed-controlled parallel pumping systems can be created by utilizing the pump operating point estimation available in modern frequency converters. Hence, additional real-time flow metering, start-up measurements, and a detailed system model are unnecessary, and the pumping task can be fulfilled by determining for each parallel pump unit a speed reference that promotes energy-efficient operation of the pumping system.
Abstract:
The Laboratory of Intelligent Machines researches and develops energy-efficient power transmissions and automation for mobile construction machines and industrial processes. The laboratory's particular areas of expertise include mechatronic machine design using virtual technologies and simulators, and demanding industrial robotics. The laboratory has collaborated extensively with industrial actors and has participated in significant international research projects, particularly in the field of robotics. For years, dSPACE tools were the only hardware used in the lab to develop control algorithms in real time. dSPACE's hardware systems are in widespread use in the automotive industry and are also employed in drives, aerospace, and industrial automation. However, new competitors are developing sophisticated systems whose features convinced the laboratory to test new products. One of these competitors is National Instruments (NI). In order to get to know the specifications and capabilities of NI tools, an agreement was made to test an NI evaluation system, which is used to control a 1-D hydraulic slider. The objective of this research project is to develop a control scheme for the teleoperation of a hydraulically driven manipulator, to implement a control algorithm for both human-machine interaction and machine-task environment interaction on the NI and dSPACE systems simultaneously, and to compare the results.
Abstract:
Demand for energy systems offering high efficiency and the ability to harness renewable energy sources is a key issue in tackling the threat of global warming and saving natural resources. Organic Rankine cycle (ORC) technology has been identified as one of the most promising technologies for recovering low-grade heat sources and for harnessing renewable energy sources that cannot be efficiently utilized by more conventional power systems. The ORC is based on the working principle of the Rankine process, but an organic working fluid is adopted in the cycle instead of steam. This thesis presents numerical and experimental results of a study on the design of small-scale ORCs. Two main applications were selected for the thesis: waste heat recovery from small-scale diesel engines, concentrating on the utilization of exhaust gas heat, and waste heat recovery in large industrial-scale engine power plants, considering the utilization of both high- and low-temperature heat sources. The main objective of this work was to identify suitable working fluid candidates and to study the process and turbine design methods that can be applied to power plants based on non-conventional working fluids. The computational work included thermodynamic analysis methods and turbine design methods based on highly accurate fluid properties. In addition, the design of, and loss mechanisms in, supersonic ORC turbines were studied by means of computational fluid dynamics. The results indicated that the design of an ORC is highly influenced by the selection of the working fluid and the cycle operating conditions. The results for the turbine designs indicated that working fluid selection should not be based only on thermodynamic analysis, but also requires consideration of the turbine design.
The turbines tend to be fast rotating, entailing small blade heights at the turbine rotor inlet and highly supersonic flow in the turbine flow passages, especially when power systems with low power outputs are designed. The results indicated that the ORC is a potential solution for utilizing waste heat streams both at high and low temperatures and in both micro and larger scale applications.
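The coupling between cycle conditions and achievable performance can be made concrete with a minimal thermal-efficiency estimate. The sketch below does simple enthalpy bookkeeping over four cycle states (pump inlet, evaporator inlet, turbine inlet, condenser inlet); the enthalpy values and component efficiencies are invented for illustration and come from no fluid database:

```python
# Illustrative specific enthalpies for the four ORC states, kJ/kg (invented):
h = {1: 250.0, 2: 252.0, 3: 520.0, 4: 470.0}

def orc_efficiency(h, eta_turbine=0.75, eta_pump=0.65):
    """Cycle thermal efficiency from simple first-law bookkeeping."""
    w_pump = (h[2] - h[1]) / eta_pump       # actual pump work, kJ/kg
    w_turb = (h[3] - h[4]) * eta_turbine    # actual turbine work, kJ/kg
    q_in = h[3] - (h[1] + w_pump)           # heat added in the evaporator
    return (w_turb - w_pump) / q_in

eta = orc_efficiency(h)                     # roughly 13 % with these numbers
```

Raising the turbine efficiency in this sketch moves the result directly, which is why the abstract stresses that fluid selection cannot be decoupled from turbine design.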
Abstract:
This thesis briefly presents the basic operation and use of centrifugal pumps and parallel pumping applications. The characteristics of parallel pumping applications are compared with electrical circuits in order to find analogies between these technical fields. The purpose of studying circuit theory is to find out whether common software tools for solving circuit behaviour could be used to observe parallel pumping applications. The empirical part of the thesis introduces a simulation environment for parallel pumping systems based on the circuit components of Matlab Simulink software. The created simulation environment enables the observation of variable-speed-controlled parallel pumping systems under different control methods. The introduced simulation environment was evaluated by building a simulation model of an actual parallel pumping system at Lappeenranta University of Technology. The simulated performance of the parallel pumps was compared with measured values from the actual system. The gathered information shows that if the initial data on the system and pump performance are adequate, the circuit-based simulation environment can be exploited to observe parallel pumping systems. The introduced simulation environment can represent the actual operation of parallel pumps with reasonable accuracy. Thereby, circuit-based simulation can be used as a research tool to develop new control methods for parallel pumps.
Abstract:
The use of process simulation software has become more common in mapping paper industry processes, and such software has long been among the process design tools of Pöyry Engineering Oy. The objective of this work was to investigate the use of process simulation software in Finnish paper mills and to assess the future prospects of process simulation in forest industry engineering services in order to develop the business. The theoretical part examines, among other things, what process simulation is, why simulation is performed, and what the benefits and challenges of simulation are. It also presents the most common process simulation software in use, the progress of a simulation project, and the requirements for productizing process simulation. In the experimental part, the use of process simulation software in Finnish paper mills was investigated by means of a survey sent to all the major paper mills in Finland. The survey examined, among other things, which programs are used, what has been simulated, what still needs to be simulated, and how necessary process simulation is considered to be. The results show that all the Finnish paper mills that responded to the survey have used process simulation. Most of the simulations have concerned machine lines and stock and water systems. The simulation of energy flows is regarded as the most important future target. The long-term exploitation and maintenance of simulation models needs development, and acquiring simulation as a service is the most likely option for the mills. The conclusion is that the mills have a need for process simulation. According to the survey results, the climate is favourable and simulation is seen as a necessary tool. In addition to a separate service product, the marketing of process simulation should be developed so that the maintenance of the simulation model continues as a local service after the project. Marketing should take place in the early stages of, or during, the project.
From the wide range of simulation programs, an engineering office should select the software that suits it best. In special cases, the acquisition of other programs is worth considering according to the customer's wishes.
Abstract:
This Master's thesis was carried out for the Washers & Filters product group of Andritz Oy. The work is part of a project introducing a 3D design system. The objective is to evaluate the effects of the new equipment design system on the company's information systems and working practices, and to identify potential targets for future development. The requirements placed on the systems and software were determined by interviewing representatives of the stakeholders who use the design data. By studying the software, an understanding of its current state and development directions was obtained. Systems based on the utilization of 3D geometry can eliminate duplicated work and shorten lead times in design and manufacturing by placing work phases in parallel. The greatest benefits of 3D design are achieved in the product development phase, when making product changes, and when planning manufacturing processes. The most problematic areas in developing the information systems are, primarily, data transfer between the software, employees' resistance to change, and the high price of high-quality systems. Carrying out large information system projects is very challenging, and success requires the involvement of all stakeholders and careful project coordination.
Abstract:
The objective of the work has been to study why systems thinking should be used in combination with TQM, what the main benefits of the integration are, and how it could best be done. The work analyzes the development of systems thinking and TQM over time and the main differences between them. The work defines prerequisites for adopting a systems approach and the organizational factors that embody the development of an efficient learning organization. The work proposes a model, based on a combination of an interactive management model and redesign, to be used for applying the systems approach with TQM in practice. The results of the work indicate that there are clear differences between systems thinking and TQM which justify their combination. The systems approach provides an additional, complementary perspective to quality management. TQM is focused on optimizing operations at the operational level, while interactive management and redesign of the organization are focused on optimizing operations at the conceptual level, providing a holistic system for value generation. The empirical study demonstrates the applicability of the proposed model in one case study company, but its application is tenable and possible beyond this particular company as well. System dynamics modeling and other systems-based techniques, such as cognitive mapping, are useful methods for increasing understanding of and learning about the behavior of systems. The empirical study emphasizes the importance of using a proper early warning system.
Abstract:
Computational model-based simulation methods were developed for the modelling of bioaffinity assays. Bioaffinity-based methods are widely used to quantify a biological substance in biological research and development and in routine clinical in vitro diagnostics. Bioaffinity assays are based on the high affinity and structural specificity between the binding biomolecules. The simulation methods developed are based on a mechanistic assay model, which relies on chemical reaction kinetics and describes the formation of the bound component as a function of time from the initial binding interaction. The simulation methods focused on studying the behaviour and reliability of bioaffinity assays and the possibilities that modelling binding reaction kinetics provides, such as predicting assay results even before the binding reaction has reached equilibrium. A rapid quantitative result from a clinical bioaffinity assay sample can be very significant; for example, even the smallest elevation of a heart muscle marker reveals a cardiac injury. The simulation methods were used to identify critical error factors in rapid bioaffinity assays. A new kinetic calibration method was developed to calibrate a measurement system from kinetic measurement data utilizing only one standard concentration. A node-based method was developed to model multi-component binding reactions, which have been a challenge for traditional numerical methods. The node-based method was also used to model protein adsorption as an example of the nonspecific binding of biomolecules. These methods have been compared with experimental data from practice and can be utilized in in vitro diagnostics, drug discovery, and medical imaging.
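The mechanistic assay model referred to above can be sketched for the simplest one-step binding reaction A + B <-> AB, whose kinetics are d[AB]/dt = k_on[A][B] - k_off[AB]. The rate constants, concentrations and the explicit-Euler integrator below are illustrative choices, not the thesis's implementation:

```python
def simulate_binding(A0, B0, k_on, k_off, t_end, dt=1e-3):
    """Forward-Euler integration of the bound-complex concentration over time."""
    AB = 0.0
    for _ in range(int(t_end / dt)):
        A, B = A0 - AB, B0 - AB                  # mass balance on free species
        AB += dt * (k_on * A * B - k_off * AB)   # d[AB]/dt
    return AB

# With equal initial concentrations, the equilibrium solves the quadratic
# k_on*(A0 - AB)^2 = k_off*AB; long runs should approach that value, while
# short runs give the pre-equilibrium signal a rapid assay would read.
AB_eq = simulate_binding(A0=1.0, B0=1.0, k_on=1.0, k_off=0.1, t_end=50.0)
```

Reading the curve before equilibrium, then extrapolating with the model, is the idea behind the rapid-assay prediction mentioned in the abstract.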
Abstract:
The rapid ongoing evolution of multiprocessors will lead to systems with hundreds of processing cores integrated on a single chip. An emerging challenge is the implementation of reliable and efficient interconnection between these cores as well as other components in the system. Network-on-Chip is an interconnection approach intended to solve the performance bottleneck caused by traditional, poorly scalable communication structures such as buses. However, a large on-chip network involves issues related to congestion and system control, for instance. Additionally, faults can cause problems in multiprocessor systems; these can be transient faults, permanent manufacturing faults, or faults that appear due to aging. To solve the emerging traffic management and controllability issues, and to maintain system operation regardless of faults, a monitoring system is needed. The monitoring system should be dynamically applicable to various purposes, and it should fully cover the system under observation. In a large multiprocessor the distances between components can be relatively long. Therefore, the system should be designed so that the amount of energy-inefficient long-distance communication is minimized. This thesis presents a dynamically clustered distributed monitoring structure. The monitoring is distributed so that no centralized control is required for basic tasks such as traffic management and task mapping. To enable extensive analysis of different Network-on-Chip architectures, an in-house SystemC-based simulation environment was implemented. It allows transaction-level analysis without time-consuming circuit-level implementations during the early design phases of novel architectures and features. The presented analysis shows that the dynamically clustered monitoring structure can be efficiently utilized for traffic management in faulty and congested Network-on-Chip-based multiprocessor systems.
The monitoring structure can also be successfully applied for task mapping purposes. Furthermore, the analysis shows that the presented in-house simulation environment is a flexible and practical tool for extensive Network-on-Chip architecture analysis.
Abstract:
Capacitive measurement technology is based on the change in capacitance between a sensor and a target: when the capacitance changes, the impedance of the sensor also changes. By exploiting this relationship, a measurement signal can be produced from a varying parameter. This work briefly presents the techniques used for accurate position measurement over short distances and, with the help of source material and simulation, examines the basic properties of capacitive position sensors and the requirements of their practical implementation. In addition, current commercial measurement systems based on different techniques are compared with each other. Based on the comparison, capacitive measurement systems offer the highest measurement accuracy over a short measurement range when the measurement environment and target are suitable for a capacitive sensor. Inductive sensors offer a higher measurement bandwidth and are better suited than capacitive sensors to dirty environments. Optical systems, in turn, enable a longer measurement range.
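The basic principle described above can be sketched with the parallel-plate idealization C = eps0*A/d: measuring the capacitance (via the sensor impedance) lets the gap be recovered by inversion. The electrode area and gap below are invented, and real probes additionally need guard electrodes and fringe-field corrections:

```python
EPS0 = 8.854e-12              # permittivity of free space, F/m

def capacitance(gap_m, area_m2=1e-4):
    """Ideal parallel-plate capacitance for a given probe-target gap."""
    return EPS0 * area_m2 / gap_m

def gap_from_capacitance(C, area_m2=1e-4):
    """Invert C = eps0*A/d to recover the probe-target distance."""
    return EPS0 * area_m2 / C

C = capacitance(100e-6)       # 1 cm^2 electrode at a 100 um nominal gap
d = gap_from_capacitance(C)   # recovered gap
```

The inverse relationship also shows why the technique excels only over short ranges: sensitivity dC/dd falls off as 1/d^2.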
Abstract:
This thesis is a literature study that develops a conceptual model of decision making and decision support in service systems. The study is related to the Ä-Logi, Intelligent Service Logic for Welfare Sector Services research project, and the objective of the study is to develop the necessary theoretical framework to enable further research based on the research project results and material. The study first examines the concepts of service and service systems, focusing on understanding the characteristics of service systems and their implications for decision making and decision support to provide the basis for the development of the conceptual model. Based on the identified service system characteristics, an integrated model of service systems is proposed that views service systems through a number of interrelated perspectives that each offer different, but complementary, implications on the nature of decision making and the requirements for decision support in service systems. Based on the model, it is proposed that different types of decision making contexts can be identified in service systems that may be dominated by different types of decision making processes and where different types of decision support may be required, depending on the characteristics of the decision making context and its decision making processes. The proposed conceptual model of decision making and decision support in service systems examines the characteristics of decision making contexts and processes in service systems, and their typical requirements for decision support. First, a characterization of different types of decision making contexts in service systems is proposed based on the Cynefin framework and the identified service system characteristics. 
Second, the nature of decision making processes in service systems is proposed to be dual, with both rational and naturalistic decision making processes existing in service systems and having important, complementary roles in decision making. Finally, a characterization of typical requirements for decision support in service systems is proposed that examines the decision support requirements associated with different types of decision making processes in characteristically different types of decision making contexts. It is proposed that decision support for decision making processes based on rational decision making can rest on organizational decision support models, while decision support for processes based on naturalistic decision making should rest on supporting the decision makers' situation awareness and facilitating the development of their tacit knowledge of the system and its tasks. Based on the proposed conceptual model, a further research process is outlined. The study additionally provides a number of new perspectives on the characteristics of service systems and on the nature of decision making and the requirements for decision support in service systems, which can provide a basis for further discussion and research and can also support practice.
Abstract:
Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information and offering better information visibility to business ecosystem actors. The flows of products, components and raw materials in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the parallel service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBEs in which information logistics integration has a significant role as a value driver.
However, traditional economic and computing theories do not focus on digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks that can be used to explore digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not currently part of a company's strategic process. In this thesis, we have developed and tested a framework to explore the digital business ecosystems developed and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we used the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which sought cost savings, and on Real Option Valuation, which sought an optimal investment programme at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we utilized the design of the core information model for B2B integration.
We built this quantitative analysis using the Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, in which the current literature needs to be improved. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, the research was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Building on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to the collaboration issues of integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
Abstract:
The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on the one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modelling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic over the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathway recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modelling framework studied in this thesis. The rationale of reaction systems is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modelling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state and periodicity, in order to do model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties ranges from P to NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
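The facilitation/inhibition semantics of reaction systems is compact enough to sketch directly: a reaction (R, I, P) is enabled in a state S iff R is a subset of S and I is disjoint from S, and the successor state is the union of the products of all enabled reactions (threshold resources, no counting). The interpreter below follows that standard definition; the heat-shock-flavoured species names are invented toy examples, not the thesis's model:

```python
def rs_step(state, reactions):
    """One step of reaction-system semantics over a set-valued state."""
    result = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            result |= products            # enabled reaction contributes products
    return result

# Toy example: hsf trimerizes unless hsp sequesters it; the trimer drives hsp.
reactions = [
    ({"hsf"}, {"hsp"}, {"hsf3"}),         # facilitated only without the inhibitor
    ({"hsf3"}, set(), {"hsp", "hse"}),
]
s1 = rs_step({"hsf"}, reactions)
s2 = rs_step(s1, reactions)
```

Iterating `rs_step` gives the state sequences on which properties such as steady state and periodicity, mentioned above, are checked.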
Abstract:
The majority of research work carried out in the field of Operations Research uses methods and algorithms to optimize the pick-up and delivery problem. Most studies aim to solve the vehicle routing problem, to accommodate optimal delivery orders, vehicles, etc. This paper focuses on a green logistics approach, in which the existing public transport infrastructure of a city is used for the delivery of small and medium-sized packaged goods, thus helping to reduce urban congestion and greenhouse gas emissions. A study was carried out to investigate the feasibility of the proposed multi-agent-based simulation model in terms of cost, time and energy consumption. A multimodal Dijkstra shortest path algorithm and Nested Monte Carlo Search were employed in a two-phase algorithmic approach used to generate a time-based cost matrix. The quality of a tour depends on the efficiency of the search algorithm implemented for plan generation and route planning. The results reveal a definite advantage of using public transportation over existing delivery approaches in terms of energy efficiency.
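The first phase of the two-phase approach, finding shortest travel times over a multimodal network, can be sketched with a standard Dijkstra search; each run yields one row of the time-based cost matrix. The stops, modes and minute weights below are invented for illustration:

```python
import heapq

# Multimodal graph: node -> [(neighbour, travel time in minutes, mode), ...]
graph = {
    "depot": [("stop_a", 5, "walk"), ("stop_b", 12, "walk")],
    "stop_a": [("hub", 9, "tram")],
    "stop_b": [("hub", 4, "bus")],
    "hub": [("customer", 6, "walk")],
    "customer": [],
}

def shortest_times(graph, source):
    """Dijkstra: minimal travel time from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                       # stale queue entry
        for nxt, w, _mode in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist

times = shortest_times(graph, "depot")     # one cost-matrix row for "depot"
```

The second phase, which the abstract assigns to Nested Monte Carlo Search, would then order deliveries using rows like this as its cost input.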