33 results for Distributed systems, modeling, composites, finite elements
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; (3) composite materials modeling activities and the resulting requirements for software development. Using design science as the research methodology, a distributed system for creating models of composite materials is created and evaluated. The empirical experiments we conducted showed good agreement between the modeled and real processes. Throughout the study, we paid particular attention to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.
Abstract:
The focus of this dissertation is to develop finite elements based on the absolute nodal coordinate formulation. The absolute nodal coordinate formulation is a nonlinear finite element formulation that was introduced for the special requirements of flexible multibody dynamics. In this formulation, a special definition for the rotation of elements is employed to ensure that the formulation does not suffer from singularities due to large rotations. The absolute nodal coordinate formulation can be used for analyzing the dynamics of beam, plate and shell type structures. The improvements to the formulation mainly concern the description of transverse shear deformation. Additionally, the formulation is verified against conventional isoparametric solid finite elements and the geometrically exact beam theory. Previous claims about especially high eigenfrequencies are studied by introducing beam elements based on the absolute nodal coordinate formulation in the framework of the large rotation vector approach. Additionally, the same high eigenfrequency problem is studied by using constraints for transverse deformation. It was determined that the improvements for shear deformation in the transverse direction lead to clear improvements in computational efficiency. This was especially true when a comparative stress must be defined, for example when using an elasto-plastic material. Furthermore, the developed plate element can be used to avoid certain numerical problems, such as shear and curvature locking. In addition, it was shown that, when compared to conventional solid elements or elements based on nonlinear beam theory, elements based on the absolute nodal coordinate formulation do not lead to an especially stiff system of equations of motion.
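For readers unfamiliar with the formulation, its core idea can be stated compactly (these are the standard relations from the multibody literature; the symbols are generic, not the thesis's own notation):

```latex
% Position of a material point of the element:
r(x, t) = S(x)\, e(t)
% Nodal coordinates collect global positions and position-vector
% gradients, e.g. for a planar beam node i:
e_i = \begin{bmatrix} r_i \\ \partial r_i / \partial x \end{bmatrix}
% Because rotations enter only through the gradient vectors, no rotation
% parameters (and hence no large-rotation singularities) appear, and the
% mass matrix is constant:
M = \int_V \rho\, S^{T} S \, dV, \qquad M \ddot{e} + Q_k(e) = Q_e
```

Here $S$ is the shape-function matrix, $Q_k$ the elastic forces and $Q_e$ the external forces; the constant mass matrix is one of the main computational attractions of the approach.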
Abstract:
The absolute nodal coordinate formulation was originally developed for the analysis of structures undergoing large rotations and deformations. This dissertation proposes several enhancements to finite beam and plate elements based on the absolute nodal coordinate formulation. The main scientific contribution of this thesis lies in the development of elements based on the absolute nodal coordinate formulation that do not suffer from the commonly known numerical locking phenomena. These elements can be used in the future in a number of practical applications, for example, the analysis of biomechanical soft tissues. This study presents several higher-order Euler–Bernoulli beam elements, a simple method to alleviate Poisson's and transverse shear locking in gradient deficient plate elements, and a nearly locking free gradient deficient plate element. The gradient deficient plate elements developed in this dissertation exhibit most of the common numerical locking phenomena encountered when the elastic energy is formulated with a continuum mechanics based description. Thus, with these fairly straightforwardly formulated elements, which comprise only position and transverse direction gradient degrees of freedom, the pathologies of, and remedies for, the numerical locking phenomena are presented in a clear and understandable manner. The analysis of the Euler–Bernoulli beam elements developed in this study shows that the choice of higher gradient degrees of freedom as nodal degrees of freedom leads to a smoother strain field, which improves the rate of convergence.
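A common remedy for Poisson (thickness) locking in continuum-mechanics-based ANCF plate elements, cited here as a standard technique from the ANCF literature rather than as this thesis's specific method, is to split the elastic energy so that the Poisson coupling terms are evaluated only at the element midplane:

```latex
% Split of the constitutive matrix into a decoupled (\nu = 0) part and
% the Poisson coupling part:
D = D^{0} + D^{\nu}
% The decoupled part is integrated over the volume, the coupling part
% over the midplane only, which removes the spurious through-thickness
% constraint responsible for the locking:
U = \tfrac{1}{2}\int_V \varepsilon^{T} D^{0} \varepsilon \, dV
  + \tfrac{1}{2}\int_A \bigl(\varepsilon^{T} D^{\nu} \varepsilon\bigr)\big|_{z=0}\, h \, dA
```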
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. The design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness.
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and cloud computing.
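The kind of quantitative question such an analysis answers can be illustrated with a minimal availability model of a reconfiguration strategy. This is a plain-Python stand-in for the SimPy experiments; all names and numbers are illustrative, not taken from the thesis:

```python
def availability(failure_times, detect_delay, repair_time, horizon):
    """Fraction of the horizon during which the system is up, assuming
    non-overlapping failures: each failure causes an outage lasting the
    error-detection latency plus the reconfiguration (repair) time."""
    downtime = 0.0
    for t_fail in failure_times:
        t_restored = min(t_fail + detect_delay + repair_time, horizon)
        downtime += t_restored - t_fail
    return 1.0 - downtime / horizon

# Faster error detection directly improves availability, which is the
# kind of trade-off a reconfiguration-strategy analysis quantifies.
slow = availability([100.0, 300.0], detect_delay=10.0, repair_time=8.0, horizon=1000.0)
fast = availability([100.0, 300.0], detect_delay=2.0, repair_time=8.0, horizon=1000.0)
```

In this toy model, cutting the detection latency from 10 to 2 time units raises availability from 96.4 % to 98 %.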
Abstract:
The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of these environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of these environments, the networks connecting their processors were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the workunits are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications.
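The value of application-specific workload information can be illustrated with a toy comparison: round-robin assignment, which ignores workunit costs, versus longest-processing-time-first, which consumes per-workunit cost estimates of the kind the enhanced algorithms use. This is a generic sketch, not one of the thesis's three algorithms:

```python
def round_robin(costs, n_workers):
    """Assign workunits cyclically, ignoring their estimated costs."""
    loads = [0.0] * n_workers
    for i, c in enumerate(costs):
        loads[i % n_workers] += c
    return max(loads)          # makespan: the most loaded worker finishes last

def lpt(costs, n_workers):
    """Longest-processing-time-first: uses the application-specific cost
    estimates to place each heavy workunit on the least loaded worker."""
    loads = [0.0] * n_workers
    for c in sorted(costs, reverse=True):
        loads[loads.index(min(loads))] += c
    return max(loads)

heterogeneous = [7, 1, 1, 1, 6, 1, 1, 6]   # estimated workunit costs
naive = round_robin(heterogeneous, n_workers=2)
informed = lpt(heterogeneous, n_workers=2)
```

With heterogeneous workunits the informed schedule finishes in 12 time units against 15 for the blind one; the gap grows as the workload becomes more uneven.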
Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm that is executed by the communication thread, so scheduling does not interfere with the execution of the parallel application. Performance results show that MPIT achieves considerable improvements over conventional MPI applications.
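The overlap idea behind a dedicated communication thread can be sketched with plain Python threads, with a queue standing in for the MPI send path. This is illustrative only; the actual MPIT paradigm uses MPI, and all names here are invented:

```python
import threading
import queue

outbox = queue.Queue()    # work handed off by the compute loop
results = queue.Queue()   # stand-in for the remote side

def comm_thread():
    """Dedicated communication thread: drains the outbox while the main
    thread keeps computing, so communication overlaps with computation."""
    while True:
        msg = outbox.get()
        if msg is None:        # sentinel: computation is finished
            break
        results.put(msg)       # stand-in for an MPI send

t = threading.Thread(target=comm_thread)
t.start()
for chunk in range(4):
    partial = sum(range(chunk * 10, (chunk + 1) * 10))  # "computation"
    outbox.put(partial)        # hand off; the main thread never blocks on I/O
outbox.put(None)
t.join()

total = 0
while not results.empty():
    total += results.get()
```

The compute loop never waits on the send path; in the real paradigm the same thread also runs the application-specific scheduler.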
Abstract:
Today’s commercial web sites are under heavy user load and are expected to be operational and available at all times. Distributed system architectures have been developed to provide a scalable and failure-tolerant high-availability platform for these web-based services. The focus of this thesis was to specify and implement a resilient and scalable, locally distributed high-availability system architecture for a web-based service. The theory part concentrates on the fundamental characteristics of distributed systems and presents common scalable high-availability server architectures used in web-based services. The practical part of the thesis explains the implemented new system architecture and includes two test cases that were run to measure the system's performance capacity.
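The failover behaviour such architectures rely on can be sketched as a round-robin dispatcher that skips backends failing their health check. This is a toy model; the backend names and health map are invented for illustration:

```python
import itertools

class LoadBalancer:
    """Round-robin over backends, skipping any whose health check fails."""
    def __init__(self, backends, health_check):
        self.backends = backends
        self.health_check = health_check
        self._cycle = itertools.cycle(backends)

    def pick(self):
        # Try each backend at most once per request before giving up.
        for _ in range(len(self.backends)):
            b = next(self._cycle)
            if self.health_check(b):
                return b
        raise RuntimeError("no healthy backend available")

# web2 has failed; traffic flows to the remaining healthy backends.
healthy = {"web1": True, "web2": False, "web3": True}
lb = LoadBalancer(["web1", "web2", "web3"], lambda b: healthy[b])
picks = [lb.pick() for _ in range(4)]
```

Clients see no outage as long as one backend passes its health check, which is the essence of a locally distributed high-availability platform.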
Abstract:
The goal of the work was to implement an information system for the needs of milk and dairy product statistics. The information system must support nearly the entire statistics production process, from data entry to reporting. The requirements came from a requirements specification and had to be reconciled with the IT management guidelines. The information system had to be built cost-effectively within a given schedule. In addition, practices were to be established for future statistical information systems. The theory part first covers the fundamentals of statistical research and the requirements placed on statistical authorities, and then proceeds to the practical statistics production process and its system requirements. The latter half of the theory part reviews the software production process and the basic concepts of information system design. The practical part breaks down the requirements and the problem domain, and then analyses alternative solutions. These lead to the implemented solution, whose results are examined on the basis of the project outcomes and two years of operational experience. The information system was implemented successfully, and it enables economical, high-quality milk and dairy product statistics in Finland.
Abstract:
High availability is an essential part of modern, integrated enterprise systems. As companies go international, information must be available around the clock, which places ever stricter availability requirements on the individual parts of a system. Growing information system integration, in turn, makes the system's nodes critical to the business. This thesis examines the characteristics of distributed systems and the challenges they pose. The technologies presented include middleware, clusters and load balancing. The Java 2 Enterprise Edition (J2EE) technology used as the foundation of enterprise applications is covered in its essential parts. BEA WebLogic Server is used as the application server platform, and its features are reviewed from the point of view of distribution. In the practical part of the work, a high-availability application server environment is implemented for two different existing enterprise applications, taking into account the constraints imposed by the applications.
Abstract:
The main objective of this thesis was to analyze the usability of the registries and indexes of electronic marketplaces. The work focuses on UDDI-based electronic marketplaces, which are standardized by OASIS. UDDI registries can be used in intranets, extranets and on the Internet. Using UDDI registries, Web services can be searched in many ways, including alphabetical and domain-specific searches. Both humans and machines can use the features of UDDI registries. The thesis deals with the design principles, architectures and specifications of UDDI registries. In addition, the thesis includes the design and specifications of an electronic marketplace developed to support electronic logistics services.
Abstract:
The aim of the work is to study the analytical calculation procedures found in the literature for the eddy-current losses in the surface-mounted permanent magnets of a PMSM. The most promising algorithms are implemented in MATLAB using the dimensional data of the LUT prototype machine. In addition, finite element analysis, carried out with the Flux 2D software from Cedrat Ltd, is applied to calculate the eddy-current losses in the permanent magnets. The results obtained from the analytical methods are compared with the numerical results.
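As context for the analytical procedures mentioned, the classical thin-body estimate of eddy-current loss density is p = σω²d²B̂²/24, valid for sinusoidal flux when the skin depth is large compared to the body thickness d. The magnet-segment algorithms the thesis implements are more elaborate; this sketch only shows the baseline scaling, and the parameter values are illustrative, not the LUT machine's data:

```python
import math

def eddy_loss_density(sigma, f, d, b_peak):
    """Classical thin-body eddy-current loss per unit volume [W/m^3]:
    p = sigma * omega^2 * d^2 * B_peak^2 / 24, assuming sinusoidal flux
    density and negligible skin effect (reaction field ignored)."""
    omega = 2.0 * math.pi * f
    return sigma * omega**2 * d**2 * b_peak**2 / 24.0

# Illustrative values: magnet conductivity ~0.67 MS/m, 1 kHz flux
# pulsation, 5 mm segment thickness, 0.05 T flux density ripple.
p = eddy_loss_density(sigma=6.7e5, f=1000.0, d=5e-3, b_peak=0.05)
```

Because the loss grows with the square of the thickness, halving the segment width cuts the loss density to a quarter, which is why surface magnets are often segmented.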
Abstract:
This Master's thesis examines how network monitoring can be implemented in a distributed system. The work looks into the differences between conventional information systems and distributed systems and the characteristics of each, and discusses what network monitoring is in general and how it is usually implemented in conventional information systems. It then examines in more detail how network monitoring can be implemented efficiently in a distributed system, and what requirements and challenges arise. Two monitoring software packages developed for distributed system network monitoring and one hardware-based solution were selected for closer study and comparison. The work also assesses whether deploying such a system would be worthwhile for the company and effective from a monitoring perspective. As a result, the thesis presents how network monitoring can be implemented in a distributed system and how the existing challenges can be solved. The implementation alternatives were studied, and the best one (perfSONAR) was chosen for monitoring the target organization's customer network connections. Finally, an implementation plan is presented for monitoring the company's customer connections.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information- and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information and offering better information visibility to business ecosystem actors. The products, components and raw material flows in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the processes parallel to the service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and, thus, provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into DBE. Organizations, business networks and collaborations, even with competitors, form DBE in which information logistics integration has a significant role as a value driver. 
However, traditional economic and computing theories do not treat digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks for exploring digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not yet part of a company's strategic process. In this thesis, we have developed and tested a framework for exploring digital business ecosystems and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted numerous interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which estimated cost savings, and on Real Option Valuation, which sought an optimal investment programme at the ecosystem level. This study provides valuable knowledge about information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in the understanding of information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we designed the core information model for B2B integration.
We built the quantitative analysis on the Monte Carlo based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, where the current literature needs to be improved. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, it was dominated by one industry domain, and a more comprehensive exploration should therefore be undertaken to cover a larger population of business sectors. Building on this research, a new quantitative survey could provide new possibilities for examining information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to collaboration issues in integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration in support of a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
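The Monte Carlo part of such a quantitative analysis can be sketched as a simple cost-savings simulation. The distributions, volumes and unit costs below are invented for illustration and are not figures from the study:

```python
import random

def simulate_savings(n_runs, seed=42):
    """Monte Carlo estimate of annual cost savings from B2B automation:
    savings = transaction volume * (manual cost - automated cost), with
    uncertain volume and unit costs (all distributions illustrative)."""
    random.seed(seed)
    samples = []
    for _ in range(n_runs):
        volume = random.gauss(100_000, 15_000)    # transactions per year
        manual = random.uniform(2.0, 4.0)         # EUR per manual transaction
        automated = random.uniform(0.2, 0.6)      # EUR per automated one
        samples.append(volume * (manual - automated))
    samples.sort()
    mean = sum(samples) / n_runs
    p5 = samples[int(0.05 * n_runs)]              # pessimistic 5th percentile
    return mean, p5

mean, p5 = simulate_savings(10_000)
```

Reporting a pessimistic percentile alongside the mean is what makes such a simulation useful as an input to investment decisions, for instance as the underlying value in a real-option calculation.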
Abstract:
In this thesis work, a strength analysis is carried out for a boat trailer. The studied trailer structure is manufactured from Ruukki’s structural steel S420, and the main focus is on the trailer’s frame. The investigation consists of two main stages: strain gage measurements and finite element analysis. Strain gage measurements were performed on the current boat trailer in February 2015. The static durability and fatigue life of the trailer are analyzed with finite element analysis for two different materials: the current trailer material, Ruukki’s structural steel S420, and an alternative material, the high-strength precision tube Form 800. The main motivation for using high-strength steel in the trailer is weight reduction. The applied fatigue analysis methods are the effective notch stress and structural hot-spot stress approaches. The target of these strength analyses is to determine whether it is reasonable to change the trailer material to high-strength steel. The static strength of both the S420 and Form 800 trailers is sufficient, but the fatigue strength of the Form 800 trailer is considerably lower than that of the S420 trailer. For future research, the effect of hot-dip galvanization on the high-strength steel has to be investigated; this is studied in laboratory tests that are not included in this thesis.
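Both fatigue approaches mentioned (effective notch stress and structural hot-spot stress) end in an S-N curve evaluation. A minimal sketch of that last step uses the generic IIW-style relation N = 2·10⁶ (FAT/Δσ)^m with slope m = 3; the FAT class and stress ranges below are illustrative, not the values used in the thesis:

```python
def cycles_to_failure(delta_sigma, fat_class, m=3):
    """S-N life estimate: N = 2e6 * (FAT / delta_sigma)**m, where FAT is
    the characteristic stress range [MPa] at 2 million cycles (IIW
    convention) and delta_sigma the applied stress range [MPa]."""
    return 2e6 * (fat_class / delta_sigma) ** m

# Illustrative comparison for one weld detail (FAT 100): the cubic slope
# makes fatigue life extremely sensitive to the stress range.
n_low = cycles_to_failure(delta_sigma=50.0, fat_class=100.0)
n_high = cycles_to_failure(delta_sigma=100.0, fat_class=100.0)
```

Halving the stress range increases the predicted life eightfold, which is why a lower fatigue strength can outweigh the static-strength margin a high-strength steel offers.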