77 results for DDM Data Distribution Management testbed benchmark design implementation instance generator
Abstract:
Due to advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power together with low power consumption. On the one hand, this enables parallel execution of highly intensive applications; with their computational power, these platforms are likely to be used in various application domains, from consumer electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults. These can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose a rigorous approach to designing agent-based systems for many-core platforms. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
Abstract:
The SD card (Secure Digital Memory Card) is a widely used portable storage medium. Current research on SD cards focuses mainly on SD card controllers based on FPGAs (Field Programmable Gate Arrays). Most of these rely on an API (Application Programming Interface), the AHB bus (Advanced High-performance Bus), etc., and are dedicated to realizing ultra-high-speed communication between the SD card and upper-level systems. Studies of SD card controllers play a vital role in the field of high-speed cameras and other specialized sub-areas. The FPGA-based file system and SD2.0 IP (Intellectual Property core) designed here not only exhibits a good transmission rate, but also achieves systematic file management, while retaining strong portability and practicality. The design and implementation of the file system on an SD card covers three main IP innovations. First, the combination and integration of the file system and the SD card controller makes the overall system highly integrated and practical. The popular SD2.0 protocol is implemented for the communication channels. A pure digital logic design based on VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller in the hardware layer with the FAT32 file system for the entire system. Secondly, the document management mechanism makes file processing more convenient and easy. Especially for batch processing of small files, it eases the pressure on the upper-level system of frequently accessing and processing them, thereby enhancing the overall efficiency of the system. Finally, the digital design ensures superior performance. For transmission security, a CRC (Cyclic Redundancy Check) algorithm protects the data transmission. Each module is designed to be platform-independent of macro cells and retains good portability; custom integrated instructions and interfaces facilitate ease of use. The design was tested on multiple platforms, namely the Xilinx and Altera FPGA development platforms, covering timing simulation and debugging of each module. Test results show that the designed FPGA-based file system IP supports SD, TF and Micro SD cards with the 2.0 protocol, successfully implements systematic management of stored files, and supports the SD bus mode. Read and write rates measured on a Kingston Class 10 card are approximately 24.27 MB/s and 16.94 MB/s, respectively.
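To make the CRC protection concrete: in SD bus mode, data blocks are protected by CRC-16-CCITT (polynomial 0x1021, initial value 0x0000). The thesis implements this in VHDL; the Python sketch below is only a minimal bit-serial reference model, and the test block is illustrative rather than taken from the thesis.

```python
# Bit-serial reference model of CRC-16-CCITT (poly 0x1021, init 0x0000),
# the checksum protecting SD data blocks in SD bus mode. Illustrative only;
# the thesis realizes this in VHDL hardware, not in software.
def crc16_ccitt(data: bytes, crc: int = 0x0000) -> int:
    for byte in data:
        crc ^= byte << 8                 # feed the next byte into the high bits
        for _ in range(8):               # shift out one bit at a time
            crc = (crc << 1) ^ 0x1021 if crc & 0x8000 else crc << 1
            crc &= 0xFFFF                # keep the register 16 bits wide
    return crc

block = bytes([0xFF] * 512)              # hypothetical all-0xFF 512-byte data block
print(f"CRC16: 0x{crc16_ccitt(block):04X}")  # 0x7FA1, the value commonly quoted in SD references
```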
Abstract:
This Master's thesis studies the implementation of real-time activity-based costing in the information system of a Finnish SME that manufactures laser chips. In addition, the effects of activity-based costing on operational activities and on activity-based management are examined. The literature part of the thesis discusses the theories and calculation methods of activity-based costing as well as the technologies used in the technical implementation. In the implementation part, a web-based activity-based costing system was designed and implemented to support the case company's cost accounting and financial administration. The tool was integrated into the company's enterprise resource planning and manufacturing execution systems. In contrast to the data collection of traditional activity-based costing models, in the case company the inputs to the costing system arrive in real time as part of a larger information system integration. The thesis seeks to establish the relationship between the requirements of activity-based costing and database systems. The company can use the activity-based costing system, for example, in product pricing and cost accounting by viewing product-related costs from different perspectives. Conclusions can be drawn from accurate cost information, and the data produced by the system can be used to determine whether developing a particular project, customer account or product is economically viable.
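The core arithmetic behind such a system is simple: each activity's driver rate is its cost pool divided by its driver volume, and a cost object accumulates rate × driver consumption over the activities it uses. The Python sketch below illustrates this; all activity names and figures are invented and do not come from the thesis.

```python
# Hypothetical activity-based costing arithmetic; pools, drivers and figures
# are invented for illustration and are not from the thesis.
activity_pools = {                  # total cost per activity (EUR)
    "machine_setup": 12_000.0,
    "laser_cutting": 45_000.0,
    "quality_inspection": 8_000.0,
}
driver_volumes = {                  # total driver volume per activity
    "machine_setup": 150,           # number of setups
    "laser_cutting": 3_000,         # machine hours
    "quality_inspection": 400,      # inspections
}
# Driver rate = cost pool / driver volume
rates = {a: activity_pools[a] / driver_volumes[a] for a in activity_pools}

# Driver consumption of one hypothetical product batch
consumption = {"machine_setup": 2, "laser_cutting": 40, "quality_inspection": 5}
batch_cost = sum(rates[a] * consumption[a] for a in consumption)
print(f"Cost assigned to batch: {batch_cost:.2f} EUR")  # 2*80 + 40*15 + 5*20 = 860.00
```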
Abstract:
The main objective of the thesis is the description of the electricity distribution networks in the Saint Petersburg and Stockholm areas. The main similarities and differences in construction and technical performance are presented in the study. The present and future development of, and investment in, the electricity distribution network of OJSC Lenenergo are reviewed. The thesis presents an overview of the power industry reform in Russia and describes the current state of the electricity distribution sector. The study examines the participation of the foreign investor "Fortum Power and Heat Oy" in the development and management of OJSC Lenenergo. A benchmark comparison of the prices and tangible assets of the main electricity distribution companies in the Saint Petersburg and Stockholm areas is carried out.
Abstract:
Intensifying competition has confronted companies with difficult challenges. Products should reach the market faster, and new products should be better than old ones and, above all, better than the corresponding products of competitors. Moreover, design, manufacturing and other costs should remain low. Companies often attempt to meet these challenges with the help of product data, its management and its exchange. Andritz, like other companies, must take these issues into account to succeed in the competition. This thesis was carried out for Andritz, one of the world's leading suppliers of equipment and services for pulp and paper production. Andritz is deploying an ERP system at all of its sites. The company wants to exploit the system as effectively as possible, so product data covering the entire product lifecycle is also to be brought into the system. Some of the product data is created by Andritz's partners and subcontractors, so data exchange with partners should also be arranged in such a way that the data flows directly into the ERP system. The goal of this thesis is therefore to find a solution for handling the data exchange between Andritz and its partners. The thesis presents the purpose and importance of product data, its management and its exchange. Various alternative solutions for implementing a data exchange system are introduced, some of them based on general and industry-specific standards, and two commercial products are presented as well. The following standards are examined: PaperIXI, papiNet, X-OSCO, the PSK standards and RosettaNet. In addition, the data exchange solutions of the ERP vendor, SAP, are examined. The most promising of these alternatives are studied in more detail, and finally the different solutions are compared with each other in order to find the best option for Andritz's needs.
Abstract:
The objective of this Master's thesis is to benchmark the channel partner program of a global electrical engineering company against the corresponding programs of its competitors, and to construct a concrete tool for the company for comparing channel partner programs. A further objective is to identify, on the basis of the data collected in the study, the strengths and weaknesses of the company's channel partner program. The research problem is first examined in the light of the literature, focusing on benchmarking and distribution channels. Benchmarking is described here as part of quality management, with emphasis on Camp's widely used ten-step benchmarking process. The distribution channel theories examined in this study are divided into two parts: theories concerning the structure of the distribution channel, and theories concerning its management. The former focuses mainly on distribution channel models and types, the latter more closely on the concept of a partner: partnership, channel partners and channel partner programs. The aim was to collect as accurate and current information as possible on the channel partner programs of the competitor companies under study. This proved to be quite a challenging task. Nevertheless, enough information was gathered from the literature and through a survey to enable the benchmarking originally set as the objective, as well as the analyses based on it.
Abstract:
Healthcare today exploits the possibilities of information technology (IT) to improve the quality of care, reduce care-related costs, and simplify and clarify physicians' workflow. Information systems, which form the core of every IT solution, must be developed to meet numerous requirements, one of which is the ability to integrate seamlessly with other information systems. System integration, however, remains a challenging task, even though several standards have been developed for it. This thesis describes the interfacing solution of a newly developed medical information system. The requirements set for such an application are discussed, and the way in which these requirements are met is presented. The interfacing solution is divided into two parts: the information system interface and the interfacing engine. The former comprises the basic functionality needed to receive data from and send data to other systems, while the latter provides support for the standards used in the production environment. The design of both parts is presented in detail in this thesis. The problem was solved by means of a modular and generic design. This approach is shown to be a robust and flexible solution that can address a wide range of requirements set for an interfacing solution. Furthermore, it is shown how, thanks to its flexibility, the solution can easily be adapted to requirements that have not been identified in advance, thereby providing a foundation for future needs as well.
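A minimal sketch of the two-part split described above, assuming a plug-in style engine: the system interface handles receiving and sending, while the interfacing engine maps each message standard to a registered handler, so new standards can be added without touching the core. All class, method and message names here are invented for illustration and are not from the thesis.

```python
# Invented sketch of a modular interface: generic transport on one side,
# standard-specific handlers plugged into an engine on the other.
from typing import Callable, Dict

class InterfacingEngine:
    """Maps message standards (e.g. an HL7-like format) to pluggable parsers."""
    def __init__(self) -> None:
        self._parsers: Dict[str, Callable[[bytes], dict]] = {}

    def register(self, standard: str, parser: Callable[[bytes], dict]) -> None:
        self._parsers[standard] = parser        # new standards plug in without core changes

    def parse(self, standard: str, raw: bytes) -> dict:
        return self._parsers[standard](raw)

class SystemInterface:
    """Basic receive/send functionality; delegates standard handling to the engine."""
    def __init__(self, engine: InterfacingEngine) -> None:
        self.engine = engine

    def receive(self, standard: str, raw: bytes) -> dict:
        return self.engine.parse(standard, raw)  # hand the raw payload to the engine

# Usage: register a trivial parser for a made-up pipe-delimited standard.
engine = InterfacingEngine()
engine.register("PIPE", lambda raw: dict(f.split("=") for f in raw.decode().split("|")))
iface = SystemInterface(engine)
print(iface.receive("PIPE", b"patient=1234|event=admit"))  # {'patient': '1234', 'event': 'admit'}
```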
Abstract:
Logistics management is increasingly being recognised by many companies to be of critical concern. The logistics function includes, directly or indirectly, many of the new areas for achieving or maintaining competitive advantage that companies have been forced to develop due to increasing competitive pressures. The key to achieving a competitive advantage is to manage the logistics function strategically, which involves determining the most cost-effective method of providing the necessary customer service levels from the many combinations of operating procedures in the areas of transportation, warehousing, order processing and information systems, production, and inventory management. In this thesis, a comprehensive distribution logistics strategic management process is formed by integrating the periodic strategic planning process with a continuous strategic issues management process. Strategic planning is used for defining the basic objectives for a company and assuring cooperation and synergy between the different functions of a company, while strategic issues management is used on a continuous basis in order to deal with environmental and internal turbulence. The strategic planning subprocess consists of the following main phases: (1) situational analyses, (2) defining the vision and strategic goals for the logistics function, (3) determining objectives and strategies, (4) drawing up tactical action plans, and (5) evaluating the implementation of the plans and making the needed adjustments. The aim of the strategic issues management subprocess is to continuously scan the environment and the organisation for early identification of the issues having a significant impact on the logistics function, using the following steps: (1) identifying trends, (2) assessing the impact and urgency of the identified trends, (3) assigning priorities to the issues, and (4) planning responses to the issues. The Analytic Hierarchy Process (AHP) is a systematic procedure for structuring any problem. AHP is based on the following three principles: decomposition, comparative judgements, and synthesis of priorities. AHP starts by decomposing a complex, multicriteria problem into a hierarchy where each level consists of a few manageable elements, which are then decomposed into another set of elements. The second step is to use a measurement methodology to establish priorities among the elements within each level of the hierarchy. The third step is to synthesise the priorities of the elements to establish the overall priorities for the decision alternatives. In this thesis, decision support systems are developed for different areas of distribution logistics strategic management by applying the Analytic Hierarchy Process. The areas covered are: (1) logistics strategic issues management, (2) planning of the logistic structure, (3) warehouse site selection, (4) inventory forecasting, (5) defining logistic action and development plans, (6) choosing a distribution logistics strategy, (7) analysing and selecting transport service providers, (8) defining the logistic vision and strategic goals, (9) benchmarking logistic performance, and (10) logistic service management. The thesis demonstrates the potential of AHP as a systematic and analytic approach to distribution logistics strategic management.
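As an illustration of the AHP steps, the sketch below derives priority weights for three hypothetical criteria from a reciprocal pairwise comparison matrix, using the geometric-mean approximation of the principal eigenvector. The comparison values and criteria are invented, not from the thesis.

```python
# Sketch of AHP's "synthesis of priorities" step: derive normalized priority
# weights from a reciprocal pairwise comparison matrix via the geometric-mean
# approximation of the principal eigenvector. Matrix values are illustrative.
import math

def ahp_priorities(matrix: list[list[float]]) -> list[float]:
    """Return normalized priority weights for a reciprocal comparison matrix."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo_means)
    return [g / total for g in geo_means]                        # normalize to sum to 1

# Three hypothetical criteria compared pairwise on Saaty's 1-9 scale:
# criterion 1 is judged 3x as important as criterion 2, 5x as criterion 3, etc.
comparisons = [
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
]
print([round(w, 3) for w in ahp_priorities(comparisons)])  # ~[0.637, 0.258, 0.105]
```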
Abstract:
Strategic development of distribution networks plays a key role in the asset management of electricity distribution companies. Owing to the capital-intensive nature of the field and the long time span of companies' operations, the significance of a strategy is emphasised. A well-devised strategy combines awareness of the challenges posed by the operating environment with the future targets of the distribution company. Economic regulation, ageing infrastructure, scarcity of resources and tightening supply requirements, together with challenges created by climate change, put pressure on the strategy work. On the other hand, technology development related to network automation and underground cabling helps in answering these challenges. This dissertation aims at developing process knowledge and establishing a methodological framework by which key issues related to network development can be addressed. Moreover, the work develops tools by which the effects of changes in the operating environment on the distribution business can be analysed in the strategy work. To this end, the work discusses certain characteristics of the distribution business and describes the strategy process at a general level. Further, the work defines the subtasks in the strategy process and presents the key elements in the strategy work and long-term network planning. The work delineates the factors having either a direct or indirect effect on strategic planning and development needs in the networks; in particular, outage costs constitute an important part of the economic regulation of the distribution business, reliability thus being a key driver in network planning. The dissertation describes the methodology and tools applied to cost and reliability analyses in the strategy work. The work focuses on determining the techno-economic feasibility of different network development technologies; these feasibility surveys are linked to the economic regulation model of the distribution business, in particular from the viewpoint of reliability of electricity supply and allowed return. The work introduces the asset management system developed for research purposes and to support the strategy work, the calculation elements of the system, and the initial data used in the network analysis. The key elements of this asset management system are utilised in the dissertation. Finally, the study addresses the stages of strategic decision-making and the compilation of investment strategies, and illustrates the implementation of strategic planning in an actual distribution company environment.
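To illustrate how outage costs can drive such feasibility comparisons, the sketch below computes a simplified annual expected outage cost for an overhead-line feeder versus an underground-cable alternative. The cost structure (a per-kW term plus a per-kWh term scaled by duration) mirrors common regulatory models, but all parameter values and feeder data here are hypothetical, not taken from the dissertation.

```python
# Simplified expected-outage-cost comparison; unit costs and feeder data are
# invented for illustration and do not reproduce the dissertation's model.
def expected_outage_cost(faults_per_year: float, avg_duration_h: float,
                         interrupted_power_kw: float,
                         cost_per_kw: float, cost_per_kwh: float) -> float:
    """Annual expected cost: frequency * power * (per-kW + per-kWh * duration)."""
    per_fault = interrupted_power_kw * (cost_per_kw + cost_per_kwh * avg_duration_h)
    return faults_per_year * per_fault

# Overhead line: frequent, shorter faults; cable: rare but longer repairs.
overhead = expected_outage_cost(4.0, 1.5, 500, cost_per_kw=1.1, cost_per_kwh=11.0)
cabled   = expected_outage_cost(0.5, 2.0, 500, cost_per_kw=1.1, cost_per_kwh=11.0)
print(f"overhead {overhead:.0f} EUR/a vs cable {cabled:.0f} EUR/a")  # 35200 vs 5775
```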
Abstract:
The purpose of this thesis is to develop an environment that enables effective collaborative product structure management among stakeholders in each unit, throughout the entire product lifecycle, together with product data management. The thesis approaches the problem through framework models: three distinct models are depicted to support collaborative product structure management, namely an organization model, a process model and a product model. In the organization model, the formation of the key user network for the product data management (PDM) system, eDSTAT, is specified. In the process model, development is based on the case company's product development matrix. In the product model framework, product model management, product knowledge management and design knowledge management are defined as development tools, and collaboration is based on web-based product structure management. Collaborative management is executed using all of these approaches. A case study from an actual project at the case company is presented as an implementation, in order to verify the applicability of the models. A computer-aided design tool and the web-based product structure manager were used as tools of this collaboration, with the support of the key user. The current PDM system, eDSTAT, is used as a pilot case for the key user role. As a result of this development, the role of the key user as a collaboration channel is defined and established. The key user is able to provide one-on-one support for the elevator projects. Management activities are also improved through the application of a process workflow that follows criteria for each project milestone. The development demonstrates the effectiveness of product structure management across the product lifecycle and an improved production process, achieved by eliminating barriers (e.g., by improving two-way communication) during the design and production phases. The key user role is applicable on a global scale in the company.
Abstract:
This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data; in this thesis, master data means reference data about customers, products, etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. This should benefit the whole organization through improved information quality, especially cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution in the case company were compiled, and the success of the new information system and of the implementation project was evaluated.
Abstract:
Data is the most important asset of a company in the information age. Other assets, such as technology, facilities or products, can be copied or reverse-engineered, and employees can be hired away, but data remains unique to every company. As data management topics slowly move from unknown unknowns to known unknowns, tools to evaluate and manage data properly are being developed and refined. Many projects are in progress today to develop various maturity models for evaluating information and data management practices. These maturity models come in many shapes and sizes: from short and concise ones meant for a quick assessment, to complex ones that call for an expert assessment by experienced consultants. In this paper several of them, made not only by external inter-organizational groups and authors but also developed internally at a Major Energy Provider Company (MEPC), are juxtaposed and thoroughly analyzed. Apart from analyzing the available maturity models related to Data Management, this paper also selects the one with the most merit and describes and analyzes its use in performing a maturity assessment at MEPC. The utility of maturity models is two-fold: descriptive and prescriptive. Besides recording the current state of Data Management practice maturity through the assessments, the maturity model is also used to chart the way forward. Thus, after the current situation is presented, analysis and recommendations on how to improve it, based on the definitions of the higher levels of maturity, are given. Generally, the main trend observed was the widening of the Data Management field to include more business and "soft" areas (as opposed to technical ones) and a shift of focus towards the business value of data, while assuming that the underlying IT systems for managing data are "ideal", that is, left to the purely technical disciplines to design and maintain. This trend is present not only in Data Management but in other technological areas as well, where more and more attention is given to innovative use of technology, while acknowledging that the strategic importance of IT as such is diminishing.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
This study was done for the ABB Ltd Motors and Generators business unit in Helsinki. In this study, global data movement in large businesses is examined from a product data management (PDM) and enterprise resource planning (ERP) point of view. The purpose was to understand and map out how a large global business handles its data in a multi-site structure and how this can be applied in practice. This was done through an empirical interview study of five different global businesses with design locations in multiple countries. Their master data management (MDM) solutions were inspected and analyzed to understand which solution would best serve a large global architecture with many design locations. One working solution is a transactional hub, which eliminates the effects of multi-site transfers and reduces lead times. The requirements and limitations of the current MDM architecture were also analyzed, and possible ideas for reform are given.