70 results for Data anonymization and sanitization

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The purpose of this work was to collect dependability data on the flue gas line at two Finnish pulp mills, covering the period from their commissioning to the present day. Dependability data consists of reliability data and maintenance data. With the collected data it is possible to describe the plant's dependability accurately using the following indicators: the number of unplanned failures and their repair times, equipment downtime, the probability of failures, and corrective maintenance costs relative to the total corrective maintenance costs of the flue gas line. The method used for collecting the dependability data is presented. The method used to identify the critical equipment of the flue gas line is a combination of a questionnaire survey and a modified failure mode, effects and criticality analysis. The criteria for selecting equipment for the final criticality analysis were decided on the basis of the dependability data and the questionnaire survey. The purpose of identifying the critical equipment is to find the devices in the flue gas line whose unexpected failure has the most severe consequences for the reliability, production, safety, emissions and costs of the flue gas line. With this information, limited maintenance resources can be allocated correctly. As a result of the criticality analysis, the three most critical devices in the flue gas line, common to both pulp mills, are the flue gas fans, the drag conveyors and the chain conveyors. The dependability data shows that equipment reliability is mill-specific, but in principle the same main trends can be seen in the figures presenting the probability of unplanned failures. The costs, expressed as the ratio of a device's unplanned maintenance costs to the total costs of the flue gas line, largely follow the reliability curve calculated as the ratio of the device's downtime to its operating hours. The collection of dependability data, combined with the identification of critical equipment, makes it possible to target and schedule preventive maintenance correctly over the lifetime of the equipment so that the reliability and cost-efficiency requirements are met.
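
The indicators listed above (failure counts, repair times, downtime and cost shares) are standard dependability arithmetic. As an illustration only, and not code from the thesis, the following sketch shows how such indicators could be computed from a list of unplanned failure records; all field names and figures are made up.

```python
# Minimal sketch (not from the thesis): compute dependability indicators
# per device from hypothetical unplanned failure records.
from dataclasses import dataclass

@dataclass
class Failure:
    device: str
    repair_hours: float   # downtime caused by this failure
    cost: float           # corrective maintenance cost of this failure

def dependability_indicators(failures, operating_hours, total_line_cost):
    """Summarise unplanned failures per device."""
    summary = {}
    for f in failures:
        s = summary.setdefault(f.device, {"failures": 0, "downtime_h": 0.0, "cost": 0.0})
        s["failures"] += 1
        s["downtime_h"] += f.repair_hours
        s["cost"] += f.cost
    for device, s in summary.items():
        s["mttr_h"] = s["downtime_h"] / s["failures"]          # mean time to repair
        s["mtbf_h"] = operating_hours / s["failures"]          # operating hours per failure
        s["availability"] = operating_hours / (operating_hours + s["downtime_h"])
        s["cost_share"] = s["cost"] / total_line_cost          # share of the line's corrective cost
    return summary

failures = [
    Failure("flue gas fan", 6.0, 4200.0),
    Failure("flue gas fan", 3.5, 1800.0),
    Failure("drag conveyor", 8.0, 2500.0),
]
print(dependability_indicators(failures, operating_hours=8000.0, total_line_cost=25000.0))
```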

Relevance:

100.00%

Publisher:

Abstract:

Open data refers to publishing data on the web in machine-readable formats for public access. Using open data, innovative applications can be developed to make people's lives easier. In this thesis, based on the open data cases discussed in the literature review, Open Data Lappeenranta is proposed: a service that publishes open data on the opening hours of shops and stores in the city of Lappeenranta. To prove that Open Data Lappeenranta can be created, the thesis presents the implementation of an open data system that publishes specific data about shops and stores, including their opening hours, on the web in a standard format (JSON). The published open data is then used to develop web and mobile applications that demonstrate the benefits of open data in practice. The open data system also provides manual and automatic interfaces that allow shops and stores to maintain their own data in the system. Finally, the thesis proposes a completed version of Open Data Lappeenranta that publishes open data related to other fields and businesses in Lappeenranta, beyond stores' data alone.
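
As a concrete illustration of how such a JSON interface might be consumed by the web and mobile applications mentioned above, the sketch below downloads a hypothetical shop list and picks the shops that are open at the current time. The URL, field names and record layout are assumptions for the example, not the actual Open Data Lappeenranta interface.

```python
# Minimal sketch of a client consuming machine-readable (JSON) opening-hours data.
# The URL and record structure are hypothetical.
import json
import urllib.request
from datetime import datetime

OPEN_DATA_URL = "https://example.org/opendata/lappeenranta/shops.json"  # placeholder

def fetch_shops(url=OPEN_DATA_URL):
    """Download the JSON list of shops and their opening hours."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def open_now(shops, now=None):
    """Return the names of shops whose opening hours cover the current time."""
    now = now or datetime.now()
    weekday = now.strftime("%a").lower()   # e.g. "mon"
    hour = now.hour
    return [
        shop["name"]
        for shop in shops
        if weekday in shop.get("opening_hours", {})
        and shop["opening_hours"][weekday]["open"] <= hour < shop["opening_hours"][weekday]["close"]
    ]

# Assumed record format for this sketch:
# {"name": "Example Store", "opening_hours": {"mon": {"open": 9, "close": 18}, ...}}
```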

Relevance:

100.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

This research investigates what benefits employees expect the organization of data governance to deliver for an organization, and how it supports the implementation of automated marketing capabilities. The quality and usability of data are crucial for organizations to meet various business needs, and organizations have more data and technology available that can be utilized, for example, in automated marketing. Data governance addresses the organization of decision rights and accountabilities for the management of an organization's data assets. Automated marketing means sending the right message to the right person at the right time, automatically. The research is a single case study conducted in a Finnish ICT company, which was starting to organize data governance and implement automated marketing capabilities at the time of the research. The empirical material consists of interviews with employees of the case company, and content analysis is used to interpret the interviews in order to answer the research questions. The theoretical framework of the research is derived from the morphology of data governance. The findings indicate that the employees expect the organization of data governance, among other things, to improve customer experience, improve sales, provide the ability to identify an individual customer's life situation, ensure that data is handled according to regulations, and improve operational efficiency. The organization of data governance is also expected to solve problems in customer data quality that currently hinder the implementation of automated marketing capabilities.

Relevance:

100.00%

Publisher:

Abstract:

This research concerns the Urban Living Idea Contest conducted by Creator Space™ of BASF SE during its 150th anniversary in 2015. The main objectives of the thesis are to provide a comprehensive analysis of the Urban Living Idea Contest (ULIC) and to propose a number of improvement suggestions for future years. More than 4,000 data points were collected and analyzed to investigate how the different elements of the contest functioned. Furthermore, a set of improvement suggestions was proposed to BASF SE. The novelty of this thesis lies in the data collection and the original analysis of the contest, which identified its critical elements as well as the areas that could be improved. The author of this research was a member of the organizing team and was involved in the decision-making process from the beginning until the end of the ULIC.

Relevance:

100.00%

Publisher:

Abstract:

Availability, Data Privacy and Copyrights – Opening Knowledge via Contracts and Pilots discusses how, in the Aviisi project of the National Library of Finland, digital contents and their availability were addressed together with pilot organizations.

Relevance:

100.00%

Publisher:

Abstract:

Protection of innovation in the pharmaceutical industry has traditionally been realised through the protection of inventions via patents. In the European Union, however, regulatory exclusivities restricting the market entry of generic products confer tailored, industry-specific protection on final, marketable products. This paper retraces the protection conferred by the different forms of exclusivity and assesses them in the light of the recent transparency policies of the European Medicines Agency. The purpose of the paper is to argue for rethinking the role of regulatory data as a key tool of innovation policy and for refocusing attention from patents to the existing regulatory framework. After a detailed assessment of the exclusivity regime, the paper identifies key areas of improvement calling for reassessment so as to promote the better functioning of the regime as an incentive for accelerated innovation. While economic and public health analyses must necessarily provide the final answers as to the necessity of reform, this paper provides a legal perspective on the issue, appraising the current regulatory framework and identifying areas for further analysis.

Relevance:

100.00%

Publisher:

Abstract:

Intensifying competition between companies has confronted them with difficult challenges. Products should reach the market faster, new products should be better than old ones and, above all, better than competitors' corresponding products. In addition, the design, manufacturing and other costs of the products should not be high. Product data, its management and its exchange are often used to help meet these challenges. Andritz, like other companies, has to take these issues into account in order to succeed in the competition. This work was done for Andritz, one of the world's leading manufacturers of equipment for paper and pulp production and providers of related maintenance services. Andritz is introducing an ERP system at all of its sites. The company wants to use the system as effectively as possible, so product data covering the whole life cycle should also be stored in it. Some of the product data is created by Andritz's partners and subcontractors, so data exchange between the partners should also be handled in such a way that the data goes directly into the ERP system. The goal of this work is therefore to find a solution for handling the data exchange between Andritz and its partners. This master's thesis presents the purpose and importance of product data, its management and its exchange. Various alternative solutions for implementing a data exchange system are presented; some of them are based on general and industry-specific standards. Two commercial products are also introduced. The standards examined are PaperIXI, papiNet, X-OSCO, the PSK standards and RosettaNet. In addition, the data exchange solutions of the ERP vendor, SAP, are examined. The best of these alternatives are studied in more detail, and finally the different solutions are compared with each other in order to find the best option for Andritz's needs.
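
To make the data exchange problem concrete, the sketch below shows the kind of transformation any of the candidate solutions ultimately has to perform: parsing a partner's product data message and mapping it to fields that the ERP system can import. The XML structure, tags and field names are invented for illustration and do not follow the actual papiNet, RosettaNet or SAP message formats compared in the thesis.

```python
# Minimal sketch: map a hypothetical partner product data message to flat
# records suitable for ERP import. The message format is made up.
import xml.etree.ElementTree as ET

PARTNER_MESSAGE = """
<ProductData>
  <Item>
    <PartnerItemId>SUB-1001</PartnerItemId>
    <Description>Bearing assembly</Description>
    <Material>steel</Material>
    <Weight unit="kg">12.4</Weight>
  </Item>
</ProductData>
"""

def to_erp_records(xml_text):
    """Parse the partner message and return flat dictionaries for ERP import."""
    root = ET.fromstring(xml_text)
    records = []
    for item in root.findall("Item"):
        records.append({
            "external_id": item.findtext("PartnerItemId"),
            "description": item.findtext("Description"),
            "material": item.findtext("Material"),
            "weight_kg": float(item.find("Weight").text),
        })
    return records

print(to_erp_records(PARTNER_MESSAGE))
```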

Relevance:

100.00%

Publisher:

Abstract:

There are two main objectives in this study: first, to prove the importance of data accuracy to business success, and second, to create a tool for observing and improving the accuracy of production master data in an ERP system. A sub-objective is to explain the client company's need for the new tool and what it means for the company. The theoretical part of this thesis focuses on the importance of data accuracy in decision making and its implications for business success. The basics of manufacturing planning are also introduced in order to explain the key vocabulary. The empirical part introduces the client company and its need for this study. A new master data report is presented, and finally, the analysis of the report and the actions taken based on its results are explained. The main results of this thesis are identifying the interdependence between data accuracy and business success, and providing a report for continuous master data improvement in the client company's ERP system.
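
As a sketch of what such an observation tool could look like, the example below validates a few hypothetical production master data fields and summarises the share of accurate records. The fields and rules are illustrative assumptions, not the client company's actual report.

```python
# Minimal sketch of a master data accuracy report: check each record against
# simple rules and summarise the results. Fields and rules are hypothetical.
def check_record(record):
    """Return a list of rule violations for one material master record."""
    issues = []
    if not record.get("description"):
        issues.append("missing description")
    if record.get("lot_size", 0) <= 0:
        issues.append("non-positive lot size")
    if record.get("lead_time_days") is None:
        issues.append("missing lead time")
    return issues

def accuracy_report(records):
    """Summarise how many records pass all checks and list the failing ones."""
    failing = {r["material"]: check_record(r) for r in records if check_record(r)}
    accurate = len(records) - len(failing)
    return {
        "records": len(records),
        "accurate": accurate,
        "accuracy_pct": 100.0 * accurate / len(records) if records else 0.0,
        "issues": failing,
    }

records = [
    {"material": "M-100", "description": "Valve", "lot_size": 50, "lead_time_days": 14},
    {"material": "M-200", "description": "", "lot_size": 0, "lead_time_days": None},
]
print(accuracy_report(records))
```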

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Front-end Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required for the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres to reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done from a distance, as the electronics is not accessible except during short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is extensively used, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in the autumn of 2008.
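
The zero suppression mentioned above is a standard bandwidth reduction technique: instead of transmitting the state of every channel, only the addresses of the channels that fired are sent. The sketch below illustrates the principle only; it is not the actual CMS RPC link data format or firmware logic.

```python
# Minimal sketch of zero suppression: send fired-channel addresses instead of
# the full bit pattern, and reconstruct the pattern on the receiving side.
def zero_suppress(channel_bits):
    """Return the indices of fired channels instead of the full bit pattern."""
    return [i for i, bit in enumerate(channel_bits) if bit]

def expand(fired_channels, n_channels):
    """Reconstruct the full bit pattern on the receiving side."""
    bits = [0] * n_channels
    for i in fired_channels:
        bits[i] = 1
    return bits

hits = [0] * 96
hits[12] = hits[40] = 1            # two fired channels out of 96
packet = zero_suppress(hits)       # -> [12, 40]: far less data than 96 channel bits
assert expand(packet, 96) == hits
```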

Relevance:

100.00%

Publisher:

Abstract:

Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems, so the data is inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept that creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model that enables analyzing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on the literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM, followed by the phases of system planning and the implementation project. The case consists of the background, a definition of the as-is situation, the definition of the project and the evaluation criteria, and concludes with the key results of the thesis. The final chapter, Conclusions, combines the common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To justify the project and find funding for it, the business benefits have to be defined and their realization has to be monitored. The thesis identified six success factors for the MDM system: a well-defined business case; data management and monitoring; data models and structures defined and maintained; customer and supplier data governance, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter goes through these factors on a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. This chapter could be used as guidance in a master data management project.
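
To illustrate the cross-referencing idea at the heart of MDM, the sketch below groups customer records from two hypothetical ERP systems under common master records. The naive name-and-country matching key is a simplifying assumption for the example, not a recommended matching rule.

```python
# Minimal sketch of MDM cross-referencing: link local customer records from
# several ERP systems to a shared master record. Matching logic is illustrative.
from collections import defaultdict

def match_key(record):
    """Naive key deciding which local records describe the same customer."""
    return (record["name"].strip().lower(), record["country"].upper())

def build_cross_references(records_by_system):
    """Group local customer records into master records with cross-references."""
    masters = defaultdict(lambda: {"master_id": None, "cross_refs": []})
    for system, records in records_by_system.items():
        for rec in records:
            masters[match_key(rec)]["cross_refs"].append(
                {"system": system, "local_id": rec["id"]}
            )
    for n, entry in enumerate(masters.values(), start=1):
        entry["master_id"] = f"MD-{n:05d}"
    return list(masters.values())

records_by_system = {
    "ERP_EU": [{"id": "C-1", "name": "Acme Oy", "country": "fi"}],
    "ERP_US": [{"id": "900", "name": "acme oy ", "country": "FI"}],
}
print(build_cross_references(records_by_system))
```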

Relevance:

100.00%

Publisher:

Abstract:

http://elo.aalto.fi/fi/studies/elomedia/dataseminar/