47 results for Master data
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
In a networked business environment, visibility requirements towards supply operations and the customer interface have become tighter. To meet those requirements, the case company's master data is seen as an enabler; however, the current state of the master data and its quality are not considered good enough. The target of this thesis was to develop a process for managing master data quality as a continuous process and to find solutions for cleansing the current customer and supplier data to meet the quality requirements defined in that process. Based on the theory of Master Data Management and data cleansing, a small amount of master data was analyzed and cleansed using a commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept, which demonstrated the solution's applicability to improving the quality of the current master data. Based on those findings and the theory of data management, recommendations and proposals for improving data quality were given. The results also showed that the biggest reasons for poor data quality are the lack of data governance in the company and the restrictions of the current master data solutions.
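The abstract does not name the commercial cleansing solution or its rules, but the kind of customer/supplier cleansing step it describes can be illustrated with a short sketch. The following Python snippet, using only the standard library, normalizes company names and flags likely duplicate records with fuzzy matching; the field names, suffix list, and similarity threshold are illustrative assumptions, not the thesis's actual tooling.

```python
import re
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Uppercase, drop punctuation and common legal suffixes, collapse whitespace."""
    name = re.sub(r"[^\w\s]", " ", name.upper())
    name = re.sub(r"\b(OY|OYJ|AB|LTD|GMBH|INC)\b", "", name)
    return re.sub(r"\s+", " ", name).strip()

def likely_duplicates(records, threshold=0.9):
    """Pairwise-compare normalized names; return pairs above the similarity threshold."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
            if score >= threshold:
                pairs.append((a["id"], b["id"], round(score, 2)))
    return pairs

customers = [
    {"id": "C001", "name": "Acme Steel Oy"},
    {"id": "C002", "name": "ACME STEEL OYJ"},
    {"id": "C003", "name": "Nordic Cranes Ltd"},
]
print(likely_duplicates(customers))  # [('C001', 'C002', 1.0)]
```

In a continuous quality process like the one the thesis proposes, a check of this kind would run on a schedule and feed a data steward's review queue rather than merge records automatically.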
Abstract:
Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems, and is therefore inconsistent and redundant across the various systems. Master Data Management (MDM) is a concept that creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model that enables analyzing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM, and then the phases of a system planning and implementation project. The case part covers the background, the definition of the as-is situation, the project definition, the evaluation criteria, and the key results of the thesis. The concluding chapter combines common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To justify the project and secure funding for it, the business benefits have to be defined and their realization monitored. The thesis identified six success factors for an MDM system: a well-defined business case; data management and monitoring; defined and maintained data models and structures; governance, delivery and quality of customer and supplier data; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter reviews these factors at a general level. The success factors and technical requirements relate to the essentials of MDM: governance, action and quality. The chapter could be used as guidance in a master data management project.
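The cross-referencing idea at the heart of the abstract, one enterprise-wide record pointing at each system's local key, can be made concrete with a small sketch. The structure below is a generic illustration of a "golden record" with cross-references; the class, system, and key names are invented for the example and are not from the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class MasterCustomer:
    """One enterprise-wide customer record with cross-references to local ERP keys."""
    master_id: str
    name: str
    local_keys: dict = field(default_factory=dict)  # system name -> local key

def resolve(golden_records, system, local_key):
    """Map a system-local key back to the enterprise-wide master record."""
    for rec in golden_records:
        if rec.local_keys.get(system) == local_key:
            return rec
    return None

golden = MasterCustomer("MC-1001", "Acme Steel",
                        local_keys={"ERP_EU": "40017", "ERP_US": "CUST-88"})
print(resolve([golden], "ERP_US", "CUST-88").master_id)  # MC-1001
```

An enterprise-wide consistent data model then amounts to analyzing and reporting against `master_id` instead of against each system's local numbering.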
Abstract:
After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems is scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while maintaining a consistent, unique and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense efforts are required to maintain data. When it comes to systems integration, ERPs are considered "closed" and expensive: data structures are complex, and the "out-of-the-box" integration options offered are not based on industry standards. Therefore, expensive and time-consuming projects are undertaken to get the required data flowing according to business process needs. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business process and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in accomplishing the "single version of the truth" MDM mantra. Adding one central repository of master data might prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims at understanding the current literature on MDM and contrasting it with views from professionals. The data collected from interviews revealed details on the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM. The most difficult piece of master data to manage is the "local" part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates an MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with a possible tool for evaluating its product from a vendor-agnostic perspective.
Abstract:
The objective of this Master's thesis is to study how item data management can improve cost-efficiency in a project-driven supply chain. The target company is Konecranes Heavy Lifting Oy, a subsidiary of Konecranes Oyj. Item data management is closely related to product data management. The theoretical part discusses the factors of the supply chain environment, the problems of modularity and customer-specific engineering, and the effects of the information flow on different functions. The company part compares two group-level business areas on the basis of their strategic choices, product modularity, and the item information moving in the order-delivery process. As the result of the thesis, guidelines are given for consolidating the item base, identifying and defining strategic items, managing items, and placing master data in the information system environment.
Abstract:
This study has two main objectives: first, to demonstrate the importance of data accuracy to business success, and second, to create a tool for observing and improving the accuracy of production master data in an ERP system. A sub-objective is to explain the client company's need for the new tool and its significance for the company. The theoretical part of this thesis focuses on the importance of data accuracy in decision making and its implications for business success. The basics of manufacturing planning are also introduced in order to explain the key vocabulary. The empirical part introduces the client company and its need for this study, presents the new master data report, and finally explains the analysis of the report and the actions taken based on the results of that analysis. The main results of this thesis are establishing the interdependence between data accuracy and business success, and providing a report for continuous master data improvement in the client company's ERP system.
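The abstract does not detail the report itself, but a production master data accuracy report of the kind it describes can be sketched as a set of field-level validation rules run over material records. In the hypothetical Python sketch below, the field names and sanity rules are assumptions chosen for illustration.

```python
# Planning-critical fields and the sanity rule each must satisfy (illustrative).
RULES = {
    "lead_time_days": lambda v: v is not None and 0 < v <= 365,
    "safety_stock":   lambda v: v is not None and v >= 0,
    "lot_size":       lambda v: v is not None and v > 0,
}

def accuracy_report(materials):
    """Return per-field failure counts and the share of fully valid records."""
    failures = {f: 0 for f in RULES}
    clean = 0
    for m in materials:
        ok = True
        for f, rule in RULES.items():
            if not rule(m.get(f)):
                failures[f] += 1
                ok = False
        clean += ok
    return failures, clean / len(materials)

materials = [
    {"lead_time_days": 14, "safety_stock": 50, "lot_size": 100},
    {"lead_time_days": 0, "safety_stock": None, "lot_size": 10},
]
print(accuracy_report(materials))
# ({'lead_time_days': 1, 'safety_stock': 1, 'lot_size': 0}, 0.5)
```

Tracking the failure counts over time is what turns such a report into a tool for continuous improvement rather than a one-off audit.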
Abstract:
This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data; in this thesis, master data means reference data about customers, products, etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. That should benefit the whole organization through improved information quality, especially cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution were compiled, and the success of the new information system and of the implementation project was evaluated.
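One concrete form the abstract's "semantic rules for master data maintenance" could take is a validation pass over a master data hierarchy: every node must point to an existing parent, and the parent chain must not loop. The sketch below is a generic illustration; the node names and the two rules are assumptions, not the case company's actual rule set.

```python
def validate_hierarchy(nodes):
    """nodes maps node_id -> parent_id (None for the root). Returns rule violations."""
    violations = []
    # Rule 1: every referenced parent must exist.
    for node, parent in nodes.items():
        if parent is not None and parent not in nodes:
            violations.append(f"{node}: parent {parent} does not exist")
    # Rule 2: no cycles along any parent chain.
    for start in nodes:
        seen, cur = set(), start
        while cur is not None:
            if cur in seen:
                violations.append(f"cycle in parent chain starting from {start}")
                break
            seen.add(cur)
            cur = nodes.get(cur)
    return violations

hierarchy = {"ROOT": None, "PUMPS": "ROOT", "VALVES": "ROOT", "BALL_VALVES": "VALVES"}
print(validate_hierarchy(hierarchy))  # []
```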
Abstract:
This study was done for ABB Ltd's Motors and Generators business unit in Helsinki. It examines global data movement in large businesses from a product data management (PDM) and enterprise resource planning (ERP) point of view. The purpose of the study was to understand and map out how a large global business handles its data in a multi-site structure and how this can be applied in practice. This was done through an empirical interview study of five different global businesses with design locations in multiple countries. Their master data management (MDM) solutions were inspected and analyzed to understand which solution would best benefit a large global architecture with many design locations. One working solution is a transactional hub, which negates the effects of multi-site transfers and reduces lead times. The requirements and limitations of the current MDM architecture were also analyzed and possible reform ideas were given.
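The transactional hub mentioned above can be sketched in a few lines: all sites read and write master data through one authoritative service, so there are no site-local copies to synchronize and no multi-site transfer step. This is a generic illustration of the pattern; the class and site names are invented and do not describe any of the interviewed businesses' systems.

```python
class TransactionalHub:
    """Single authoritative store for master data; sites hold no local copies."""
    def __init__(self):
        self._items = {}

    def write(self, item_id, attributes, site):
        # A write is immediately visible to every site -- the multi-site
        # replication step (and its lead time) disappears.
        self._items[item_id] = {**attributes, "last_changed_by": site}

    def read(self, item_id):
        return self._items.get(item_id)

hub = TransactionalHub()
hub.write("MAT-100", {"description": "Rotor shaft", "rev": "B"}, site="Helsinki")
print(hub.read("MAT-100"))  # any other design location reads revision B at once
```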
Abstract:
This research investigates what benefits employees expect an organization to gain from organizing data governance, and how organized data governance supports the implementation of automated marketing capabilities. The quality and usability of data are crucial for organizations to meet various business needs, and organizations have ever more data and technology available that can be utilized, for example, in automated marketing. Data governance addresses the organization of decision rights and accountabilities for the management of an organization's data assets. Automated marketing means sending the right message, to the right person, at the right time, automatically. The research is a single case study conducted in a Finnish ICT company that was starting to organize data governance and implement automated marketing capabilities at the time of the research. The empirical material consists of interviews with the case company's employees, and content analysis is used to interpret the interviews in order to answer the research questions. The theoretical framework of the research is derived from the morphology of data governance. The findings indicate that the employees expect the organization of data governance, among other things, to improve customer experience, improve sales, provide the ability to identify an individual customer's life situation, ensure that data is handled according to regulations, and improve operational efficiency. The organization of data governance is also expected to solve the problems in customer data quality that currently hinder the implementation of automated marketing capabilities.
Abstract:
The implementation of an ERP system and the changes it brings to product costing pose challenges for a company. A company operating in the metal industry has encountered these challenges while implementing the SAP R/3 ERP system and its product costing functionality. The SAP R/3 product costing logic needs information from outside the system, and failing to account for this directly affects costing accuracy. This thesis develops both a standardized process and a calculation system with which the required activity costs for the load points of a steel service center, as well as the cost rollup values, can be computed. The calculated values form the required elements of the SAP R/3 product costing master data. The aim is to promote the creation of transparent cost information. The thesis is based on the waterfall model (SDLC). First, the boundary conditions of the environment in which product costing is carried out are identified; these impose inflexible components on the calculation system to be developed, while flexible components leave the system freedom. By combining the inflexible and flexible components, a system is achieved that can compensate for the deficiencies of SAP R/3 product costing.
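The core of such a calculation system can be illustrated with two standard activity-based costing formulas: an activity cost rate per load point (period costs divided by planned capacity hours) and an overhead surcharge percentage used as a cost rollup value. The Python sketch below assumes these generic formulas; the load point names and figures are invented for illustration and are not the company's data.

```python
def activity_rate(period_costs: float, planned_hours: float) -> float:
    """Activity cost rate (EUR/h) for one load point in the steel service center."""
    return period_costs / planned_hours

def overhead_surcharge(overhead_costs: float, direct_cost_base: float) -> float:
    """Cost rollup value: overhead as a percentage surcharge on the direct cost base."""
    return 100.0 * overhead_costs / direct_cost_base

rates = {
    "SLITTING": activity_rate(120_000.0, 1_600.0),       # 75.0 EUR/h
    "CUT_TO_LENGTH": activity_rate(90_000.0, 1_500.0),   # 60.0 EUR/h
}
print(rates)
print(f"material overhead: {overhead_surcharge(50_000.0, 2_000_000.0):.1f} %")  # 2.5 %
```

Values of this kind would then be maintained as the SAP R/3 product costing master data the thesis refers to.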
Abstract:
The objective of the thesis was to carry out a technical and functional feasibility study of the changes to customer data in the ERP project of a large forest industry company. First, the concept of an ERP system, the role of a feasibility study in an IT project, and the phases of conducting a feasibility study were clarified. The theoretical part was compiled mainly from literature and internet sources. In the empirical part, the feasibility study was carried out and documented step by step, approaching the subject with a top-down method. The main focus was on examining the feasibility of the customer codes, the changes needed in the systems, and the changes to the workflow. The information for the empirical part was gathered through expert interviews, and the collected information was applied according to the methods of the theoretical part and the project documents. The end result was a complete feasibility study comprising background investigations of the subject, descriptions of the current and future situations, gap analyses, and proposals for further actions. The study concluded that the project is feasible in the area examined.
Abstract:
Inventory data management is defined as the accurate creation and maintenance of item master data, inventory location data, and inventory balances per inventory location. The accuracy of inventory data depends on many elements of product data management, such as the management of changes during a component's life-cycle and the accuracy of product configuration in enterprise resource planning systems. Cycle-counting means counting the inventory balances per inventory location and comparing them to the system data on a daily basis; the cycle-counting process is a way to measure the accuracy of a company's inventory data. Through a well-managed cycle-counting process, a company gains a lot of information about the accuracy of its inventory data. General inaccuracy of the inventory data cannot be fixed merely by assigning resources to cycle-counting: the change requires disciplined adherence to the defined processes by all parties involved in updating inventory data throughout a component's life-cycle. The processes affecting inventory data are mapped and appropriate metrics are defined in order to achieve better manageability of the inventory data. The life-cycles of a single component and of a product are used in evaluating the processes.
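The daily comparison at the heart of cycle-counting is easy to make concrete: match counted balances per (item, location) pair against system balances and report both the discrepancies and the share of matching records. The Python sketch below is a minimal illustration; the keys, quantities, and zero tolerance are assumptions for the example.

```python
def cycle_count_accuracy(system, counted, tolerance=0.0):
    """system and counted map (item, location) -> quantity. Returns (accuracy, diffs)."""
    keys = system.keys() | counted.keys()
    diffs = {}
    for key in keys:
        diff = counted.get(key, 0) - system.get(key, 0)
        if abs(diff) > tolerance:
            diffs[key] = diff
    return 1 - len(diffs) / len(keys), diffs

system = {("BOLT-M8", "A-01"): 500, ("SEAL-12", "A-02"): 40}
counted = {("BOLT-M8", "A-01"): 497, ("SEAL-12", "A-02"): 40}
print(cycle_count_accuracy(system, counted))
# (0.5, {('BOLT-M8', 'A-01'): -3})
```

As the abstract stresses, the metric only reveals inaccuracy; fixing it requires disciplined processes across the component's life-cycle, not just more counting.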
Abstract:
This Master's thesis deals with developing the production planning and control of an industrial company in small-batch production. The target is ABB Oy's Wind Power Generators profit unit, which manufactures standard products on a customer-order-driven basis. The thesis first presents the theory of production and production planning and control, covering the basics such as definitions, objectives and tasks, as well as the production control process and the information technology used in production control. The empirical part following the theory presents the means developed in this work for improving production planning and control. The research was carried out through theoretical and empirical work. The theoretical work included a study of Finnish and foreign literature sources. The empirical work was carried out through independent problem solving, which included analyzing the development targets, defining the more detailed development needs, and development work through experiments. The main objective of the research was to find out how developing production planning and control can improve the productivity and profitability of the target profit unit. Based on the main objective, six sub-objectives were formed: improving delivery reliability, raising the capacity utilization rate, developing capacity planning, shortening lead times, defining the requirements for a new ERP system, and defining the production control process. For the first four sub-objectives, IT applications were built that enable their planning and control. For these applications, the routing chains with their lead times, the load groups, the capacities of the load groups, the load factors of the products, and the critical tools, for example, were defined for each product. The work showed that information technology greatly assists production planning and control: increased transparency, improved information flow, simulation possibilities, and graphical presentation facilitate the preparation of various plans and thus improve the quality of decision-making. The utilization of information technology rests on the disciplined updating of basic production and transaction data, which is why information systems should be kept as simple as possible.
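Of the applications described above, the capacity-loading one is the most mechanical and can be sketched briefly: each order loads its load groups by the product's load factors (hours per unit), and utilization is the accumulated load against each group's capacity. All names and figures below are invented for illustration; they are not the profit unit's actual data.

```python
def utilization(orders, load_factors, capacities):
    """orders: [(product, qty)]; load_factors: product -> {load group: h/unit}."""
    load = {group: 0.0 for group in capacities}
    for product, qty in orders:
        for group, hours_per_unit in load_factors[product].items():
            load[group] += qty * hours_per_unit
    return {group: load[group] / capacities[group] for group in capacities}

load_factors = {"GEN-3MW": {"WINDING": 40.0, "ASSEMBLY": 25.0}}
capacities = {"WINDING": 320.0, "ASSEMBLY": 400.0}  # available hours per week
orders = [("GEN-3MW", 6)]
print(utilization(orders, load_factors, capacities))
# {'WINDING': 0.75, 'ASSEMBLY': 0.375}
```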
Abstract:
This diploma thesis was done for an international organization that handles the accounting operations of two major companies. The organization uses three different purchasing tools for entering new asset master data into the SAP R/3 system. The aim of this thesis is to find out how much changing the user interface of one of these three e-procurement programs affects the overall efficiency of asset accounting. In addition, a project framework is introduced that can be used in future projects and helps to avoid certain steps in the development process. At the moment, data must be entered manually with many unnecessary mouse clicks, and data must be searched from many different sources, which slows the process down; other organizations currently have better tools than the myOrders system under investigation. The research started by exploring the main improvement areas, after which possible defects were traced. Improvements were proposed on the basis of the literature on usability design and research. In parallel, indicative calculations of the project's benefits were made, along with an analysis of the possible risks and threats. NSN IT then approved the changes it considered acceptable; the next step was to program them into the tool and test them before release to the production environment. Calculations were also made for the implemented improvements and compared with the planned ones. The whole project was generalized into a framework that can also be applied to other similar projects. A complete calculation was not possible because of the project's schedule. An important observation in the project was that efficiency is improved not only by changing the GUI but also by improving processes without any programming. End-user feedback should also be given more weight in the development process; after all, the end user knows best how the program should work.
Abstract:
The business activities of companies and organizations rely on information. IT systems that manage master data must provide the highest level of service, world-class scalability, and reliable information. This study discusses product information and how it can support the creation of a spare part catalog and the launch of eBusiness. The study consists of a theoretical and an empirical part. The theoretical part contains a literature review and a framework for the analysis. For the empirical study, two companies were selected and their information management processes were studied and analyzed based on the framework. The empirical results indicate that the challenges the companies face reflect those found in the literature. The results also show that the companies had recognized both the issues that need development and the trends in eBusiness and product information management.
Abstract:
This work is devoted to the problem of reconstructing the basis weight structure of a paper web with black-box techniques. The data analyzed comes from a real paper machine and is collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analyzing the properties of the paper web. ARMA and the DFT are each used independently to represent the given signal in a simplified version of our algorithm, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the Root Mean Squared Error, gives a tool for separating significant signals from noise.
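The toolchain named in the abstract (ARMA modelling, the Ljung-Box lack-of-fit test, and RMSE) maps directly onto standard Python libraries. The sketch below illustrates that combination on a simulated scanner signal, not the thesis's actual algorithm; the model order (2, 1) and the signal are assumptions, and a recent statsmodels version is assumed for the DataFrame-returning Ljung-Box call.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Simulated basis weight profile: a periodic component plus noise.
rng = np.random.default_rng(0)
t = np.arange(512)
signal = np.sin(2 * np.pi * t / 64) + 0.3 * rng.standard_normal(512)

# ARMA(2, 1) is ARIMA with d = 0.
result = ARIMA(signal, order=(2, 0, 1)).fit()
rmse = float(np.sqrt(np.mean(result.resid ** 2)))

# Ljung-Box Q-statistic on the residuals: a large p-value means the model
# left no significant autocorrelation, i.e. only noise remains.
lb = acorr_ljungbox(result.resid, lags=[10])
print(f"RMSE: {rmse:.3f}, Ljung-Box p-value: {float(lb['lb_pvalue'].iloc[0]):.3f}")

# The DFT view of the same signal, for inspecting periodic structure.
spectrum = np.abs(np.fft.rfft(signal))
print(f"dominant frequency bin: {int(np.argmax(spectrum[1:]) + 1)}")  # ~512/64 = 8
```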